Mar 13 11:46:51 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 13 11:46:51 crc restorecon[4759]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:46:51 crc restorecon[4759]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc 
restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc 
restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 
11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc 
restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc 
restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 
crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:51 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc 
restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 11:46:52 crc restorecon[4759]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 
crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc 
restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc 
restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:46:52 crc restorecon[4759]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc 
restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:46:52 crc restorecon[4759]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 13 11:46:53 crc kubenswrapper[4786]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 11:46:53 crc kubenswrapper[4786]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 11:46:53 crc kubenswrapper[4786]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 11:46:53 crc kubenswrapper[4786]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 11:46:53 crc kubenswrapper[4786]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 11:46:53 crc kubenswrapper[4786]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.174196 4786 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181798 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181823 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181832 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181838 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181844 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181850 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181856 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181862 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181867 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181873 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181898 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181913 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181919 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181924 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181931 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181937 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181943 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181949 4786 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181954 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181959 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181964 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181969 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181974 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181980 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181984 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181989 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181994 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.181999 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182004 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182008 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182014 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182019 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182024 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182028 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182033 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182038 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182043 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182048 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182055 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182060 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182069 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182077 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182083 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182090 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182096 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182103 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182108 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182124 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182129 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182134 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182139 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182144 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182149 4786 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182153 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182158 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182163 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182168 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182173 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182177 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182183 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182187 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182192 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182199 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182203 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182208 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182215 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182220 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182224 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182229 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182234 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.182239 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182368 4786 flags.go:64] FLAG: --address="0.0.0.0"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182380 4786 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182391 4786 flags.go:64] FLAG: --anonymous-auth="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182398 4786 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182406 4786 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182412 4786 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182420 4786 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182427 4786 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182433 4786 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182439 4786 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182445 4786 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182452 4786 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182457 4786 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182463 4786 flags.go:64] FLAG: --cgroup-root=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182469 4786 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182475 4786 flags.go:64] FLAG: --client-ca-file=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182480 4786 flags.go:64] FLAG: --cloud-config=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182486 4786 flags.go:64] FLAG: --cloud-provider=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182491 4786 flags.go:64] FLAG: --cluster-dns="[]"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182498 4786 flags.go:64] FLAG: --cluster-domain=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182503 4786 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182509 4786 flags.go:64] FLAG: --config-dir=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182515 4786 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182522 4786 flags.go:64] FLAG: --container-log-max-files="5"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182529 4786 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182535 4786 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182541 4786 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182547 4786 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182553 4786 flags.go:64] FLAG: --contention-profiling="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182558 4786 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182564 4786 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182570 4786 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182575 4786 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182583 4786 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182589 4786 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182595 4786 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182601 4786 flags.go:64] FLAG: --enable-load-reader="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182606 4786 flags.go:64] FLAG: --enable-server="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182612 4786 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182620 4786 flags.go:64] FLAG: --event-burst="100"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182626 4786 flags.go:64] FLAG: --event-qps="50"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182632 4786 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182639 4786 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182644 4786 flags.go:64] FLAG: --eviction-hard=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182651 4786 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182657 4786 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182662 4786 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182668 4786 flags.go:64] FLAG: --eviction-soft=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182673 4786 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182679 4786 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182685 4786 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182690 4786 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182696 4786 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182701 4786 flags.go:64] FLAG: --fail-swap-on="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182707 4786 flags.go:64] FLAG: --feature-gates=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182714 4786 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182720 4786 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182726 4786 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182731 4786 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182737 4786 flags.go:64] FLAG: --healthz-port="10248"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182753 4786 flags.go:64] FLAG: --help="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182759 4786 flags.go:64] FLAG: --hostname-override=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182764 4786 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182770 4786 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182775 4786 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182781 4786 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182787 4786 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182792 4786 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182798 4786 flags.go:64] FLAG: --image-service-endpoint=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182803 4786 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182809 4786 flags.go:64] FLAG: --kube-api-burst="100"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182815 4786 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182821 4786 flags.go:64] FLAG: --kube-api-qps="50"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182826 4786 flags.go:64] FLAG: --kube-reserved=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182832 4786 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182837 4786 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182843 4786 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182848 4786 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182854 4786 flags.go:64] FLAG: --lock-file=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182860 4786 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182866 4786 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182871 4786 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182898 4786 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182904 4786 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182910 4786 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182916 4786 flags.go:64] FLAG: --logging-format="text"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182921 4786 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182928 4786 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182934 4786 flags.go:64] FLAG: --manifest-url=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182939 4786 flags.go:64] FLAG: --manifest-url-header=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182947 4786 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182952 4786 flags.go:64] FLAG: --max-open-files="1000000"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182960 4786 flags.go:64] FLAG: --max-pods="110"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182966 4786 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182971 4786 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182977 4786 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182983 4786 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182989 4786 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.182995 4786 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183001 4786 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183013 4786 flags.go:64] FLAG: --node-status-max-images="50"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183019 4786 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183025 4786 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183031 4786 flags.go:64] FLAG: --pod-cidr=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183037 4786 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183045 4786 flags.go:64] FLAG: --pod-manifest-path=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183051 4786 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183056 4786 flags.go:64] FLAG: --pods-per-core="0"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183062 4786 flags.go:64] FLAG: --port="10250"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183068 4786 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183074 4786 flags.go:64] FLAG: --provider-id=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183080 4786 flags.go:64] FLAG: --qos-reserved=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183085 4786 flags.go:64] FLAG: --read-only-port="10255"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183091 4786 flags.go:64] FLAG: --register-node="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183098 4786 flags.go:64] FLAG: --register-schedulable="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183103 4786 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183117 4786 flags.go:64] FLAG: --registry-burst="10"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183122 4786 flags.go:64] FLAG: --registry-qps="5"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183129 4786 flags.go:64] FLAG: --reserved-cpus=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183135 4786 flags.go:64] FLAG: --reserved-memory=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183142 4786 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183148 4786 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183154 4786 flags.go:64] FLAG: --rotate-certificates="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183159 4786 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183165 4786 flags.go:64] FLAG: --runonce="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183171 4786 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183176 4786 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183182 4786 flags.go:64] FLAG: --seccomp-default="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183187 4786 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183193 4786 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183199 4786 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183206 4786 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183211 4786 flags.go:64] FLAG: --storage-driver-password="root"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183217 4786 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183222 4786 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183228 4786 flags.go:64] FLAG: --storage-driver-user="root"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183233 4786 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183240 4786 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183246 4786 flags.go:64] FLAG: --system-cgroups=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183251 4786 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183261 4786 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183268 4786 flags.go:64] FLAG: --tls-cert-file=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183274 4786 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183282 4786 flags.go:64] FLAG: --tls-min-version=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183287 4786 flags.go:64] FLAG: --tls-private-key-file=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183293 4786 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183299 4786 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183304 4786 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183310 4786 flags.go:64] FLAG: --v="2"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183317 4786 flags.go:64] FLAG: --version="false"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183325 4786 flags.go:64] FLAG: --vmodule=""
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183335 4786 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183341 4786 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183472 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183478 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183484 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183490 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183497 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183503 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183508 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183514 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183519 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183526 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183532 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183538 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183543 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183549 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183554 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183558 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183563 4786 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183568 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183573 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183578 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183583 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183588 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183593 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183598 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183603 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183608 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183613 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183617 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183622 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183627 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183634 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183638 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183643 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183649 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183654 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183660 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183666 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183672 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183677 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183682 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183687 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183695 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183700 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183705 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183710 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183715 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183720 4786 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183725 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183730 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183735 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183742 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183748 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183754 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183760 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183767 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183773 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183779 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183785 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183790 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183795 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183800 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183829 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183839 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183845 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183851 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183857 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183868 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183899 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183906 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183915 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.183922 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.183933 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.197704 4786 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.197737 4786 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197862 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197872 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197918 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197929 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197939 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197947 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197956 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197964 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197972 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197981 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.197989 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198000 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198012 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198022 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198032 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198040 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198049 4786 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198057 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198065 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198074 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198082 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198091 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198099 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198108 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198116 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198125 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 
11:46:53.198133 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198142 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198150 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198158 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198169 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198179 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198206 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198215 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198223 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198232 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198239 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198247 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198255 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198263 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198270 
4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198278 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198286 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198293 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198301 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198309 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198317 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198324 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198332 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198340 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198348 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198358 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198369 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198377 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198387 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198395 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198405 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198413 4786 feature_gate.go:330] unrecognized feature gate: Example Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198421 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198429 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198440 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198449 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198458 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198467 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198475 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198484 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198491 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198499 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198507 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198514 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198522 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.198535 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198756 4786 
feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198770 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198778 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198787 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198796 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198804 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198812 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198820 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198829 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198837 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198845 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198853 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198861 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198869 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198900 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 11:46:53 crc 
kubenswrapper[4786]: W0313 11:46:53.198908 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198916 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198924 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198932 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198940 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198947 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198955 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198963 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198971 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198980 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198987 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.198995 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199003 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199010 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199021 
4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199032 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199039 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199050 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199060 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199068 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199076 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199085 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199096 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199105 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199114 4786 feature_gate.go:330] unrecognized feature gate: Example Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199122 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199130 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199137 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199145 4786 
feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199153 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199161 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199168 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199176 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199184 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199191 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199199 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199207 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199214 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199222 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199230 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199238 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199246 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199254 4786 feature_gate.go:330] 
unrecognized feature gate: GatewayAPI Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199262 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199269 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199277 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199286 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199293 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199301 4786 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199311 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199322 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199331 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199340 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199348 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199356 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.199365 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.199377 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.200585 4786 server.go:940] "Client rotation is on, will bootstrap in background" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.205065 4786 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.209972 4786 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.210142 4786 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.212245 4786 server.go:997] "Starting client certificate rotation" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.212293 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.212602 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.241529 4786 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.244854 4786 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.245318 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.264738 4786 log.go:25] "Validated CRI v1 runtime API" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.309017 4786 log.go:25] "Validated CRI v1 image API" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.311460 4786 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.316734 4786 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-13-11-41-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.316779 4786 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.349283 4786 manager.go:217] Machine: {Timestamp:2026-03-13 11:46:53.345515408 +0000 UTC m=+0.625168925 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ed5189ac-f697-4058-b82e-47ba3df6ef92 BootID:9070ab03-ef9a-4d2e-b143-43ffad1cba05 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:03:fa:27 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:03:fa:27 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:8a:b0:61 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:cd:f7:54 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f9:db:ce Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ed:7f:a1 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:9f:37:f9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:dc:12:6e:4e:5e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:e0:f9:32:d3:ea Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.349718 4786 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.349978 4786 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.350409 4786 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.350743 4786 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.350806 4786 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.351199 4786 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.351219 4786 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.352084 4786 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.352139 4786 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.353210 4786 state_mem.go:36] "Initialized new in-memory state store" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.353351 4786 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.358982 4786 kubelet.go:418] "Attempting to sync node with API server" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.359020 4786 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.359060 4786 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.359083 4786 kubelet.go:324] "Adding apiserver pod source" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.359102 4786 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 
11:46:53.364090 4786 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.365622 4786 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.366943 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.366959 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.367045 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.367058 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.367322 4786 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 11:46:53 
crc kubenswrapper[4786]: I0313 11:46:53.369019 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369060 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369075 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369090 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369111 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369151 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369167 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369198 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369221 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369242 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369301 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.369315 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.370385 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.371231 4786 server.go:1280] "Started kubelet" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 
11:46:53.372597 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:53 crc systemd[1]: Started Kubernetes Kubelet. Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.373215 4786 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.373312 4786 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.374144 4786 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.375319 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.375403 4786 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.376167 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.376297 4786 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.376321 4786 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.376517 4786 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.381298 4786 factory.go:55] Registering systemd factory Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.381548 4786 factory.go:221] Registration of the systemd container factory successfully Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.381821 4786 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.382367 4786 factory.go:153] Registering CRI-O factory Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.382425 4786 factory.go:221] Registration of the crio container factory successfully Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.382553 4786 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.382595 4786 factory.go:103] Registering Raw factory Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.382629 4786 manager.go:1196] Started watching for new ooms in manager Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.381492 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c64195ccefa77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,LastTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.381952 4786 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.390870 4786 manager.go:319] Starting recovery of all containers Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.392000 4786 server.go:460] "Adding debug handlers to kubelet server" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.393083 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.399836 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.399954 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.399983 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.400008 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.400034 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.400061 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402222 4786 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402277 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402294 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402309 4786 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402321 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402333 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402342 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402353 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402366 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402375 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402391 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402401 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402412 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402423 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402433 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402446 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402456 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402467 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402476 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402486 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402496 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402507 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402520 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402536 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402550 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402596 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402606 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402615 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc 
kubenswrapper[4786]: I0313 11:46:53.402626 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402636 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402645 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402662 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402673 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402683 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402693 4786 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402702 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402721 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402732 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402744 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402754 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402764 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402774 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402784 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402794 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402805 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402816 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402827 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402841 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402853 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402864 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402893 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402909 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402919 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402930 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402940 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402951 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402961 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402970 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402981 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402990 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.402999 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403008 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403018 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403028 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403037 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403046 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403056 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403065 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403073 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403082 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403091 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403100 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403108 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403117 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403126 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403134 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403143 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403158 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403168 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403178 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403189 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403199 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403210 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403219 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403230 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403241 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403252 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403263 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403272 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403283 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403292 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403302 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403312 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403323 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403333 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403344 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403358 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403367 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403377 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403397 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403409 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403426 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403437 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403447 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403458 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403469 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403484 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403495 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403504 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403515 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403525 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403535 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403564 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" 
Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403573 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403584 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403593 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403602 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403612 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403621 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403630 4786 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403639 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403649 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403659 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403668 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403677 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403687 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403696 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403709 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403721 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403730 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403744 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403754 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403763 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403772 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403782 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403792 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403802 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403812 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403821 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403830 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403840 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403849 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403858 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403867 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403893 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403905 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403913 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403923 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403931 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403940 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" 
seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403949 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403958 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403967 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403976 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403984 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.403994 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 
11:46:53.404005 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404014 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404024 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404034 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404046 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404055 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404065 4786 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404073 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404082 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404092 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404103 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404112 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404122 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404132 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404287 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404297 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404312 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404321 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404330 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404346 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404356 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404366 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404376 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404385 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404394 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404403 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404414 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404423 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404432 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404484 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404500 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404509 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404543 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404553 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404564 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404593 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404604 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404614 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404623 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404632 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404642 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404670 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404684 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" 
seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404718 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404731 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404741 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404750 4786 reconstruct.go:97] "Volume reconstruction finished" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.404758 4786 reconciler.go:26] "Reconciler: start to sync state" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.425129 4786 manager.go:324] Recovery completed Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.436802 4786 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.439184 4786 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.439248 4786 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.439282 4786 kubelet.go:2335] "Starting kubelet main sync loop" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.439436 4786 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 11:46:53 crc kubenswrapper[4786]: W0313 11:46:53.440721 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.440792 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.442872 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.444288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.444335 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.444353 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.445190 4786 
cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.445206 4786 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.445226 4786 state_mem.go:36] "Initialized new in-memory state store" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.469617 4786 policy_none.go:49] "None policy: Start" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.471251 4786 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.471304 4786 state_mem.go:35] "Initializing new in-memory state store" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.476308 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.539690 4786 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.541699 4786 manager.go:334] "Starting Device Plugin manager" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.541997 4786 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.542031 4786 server.go:79] "Starting device plugin registration server" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.542661 4786 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.542696 4786 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.542864 4786 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.543069 4786 
plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.543091 4786 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.555697 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.594655 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.643365 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.645147 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.645197 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.645215 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.645248 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.646174 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.740680 4786 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.740828 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.742372 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.742440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.742459 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.742671 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.743002 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.743100 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.743979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.744020 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.744037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.744203 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.744429 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.744493 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.744550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.744596 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.744615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.745222 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.745264 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.745287 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.745470 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.745536 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.745554 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.745565 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc 
kubenswrapper[4786]: I0313 11:46:53.745596 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.745638 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746582 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746630 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746648 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746682 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746698 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746840 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746920 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.746980 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.748109 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.748139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.748139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.748177 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.748199 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.748150 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.748428 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.748459 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.749233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.749262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.749273 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809210 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809260 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809293 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809324 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809360 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809375 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809397 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809412 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809426 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809456 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809485 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809505 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809521 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.809535 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.847285 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.848443 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.848492 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.848508 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.848538 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.848966 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910620 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910665 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910687 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910703 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910721 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910751 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910766 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910754 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910783 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910800 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910814 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910831 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910847 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910863 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910903 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910917 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.910987 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911020 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911051 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911104 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911154 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911245 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911273 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911297 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911314 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911340 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911345 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911361 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: I0313 11:46:53.911393 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:46:53 crc kubenswrapper[4786]: E0313 11:46:53.995383 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.078340 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.095573 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.104641 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:54 crc kubenswrapper[4786]: W0313 11:46:54.128519 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-9d63fba13b335d463adaccbbc61b8d56609349f139fd53767f0551d628583fbf WatchSource:0}: Error finding container 9d63fba13b335d463adaccbbc61b8d56609349f139fd53767f0551d628583fbf: Status 404 returned error can't find the container with id 9d63fba13b335d463adaccbbc61b8d56609349f139fd53767f0551d628583fbf Mar 13 11:46:54 crc kubenswrapper[4786]: W0313 11:46:54.130673 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-71dc071234b1090c7b00eb21d31e8f63f5a811debdfcc893bb0cac904b2054f0 WatchSource:0}: Error finding container 71dc071234b1090c7b00eb21d31e8f63f5a811debdfcc893bb0cac904b2054f0: Status 404 returned error can't find the container with id 71dc071234b1090c7b00eb21d31e8f63f5a811debdfcc893bb0cac904b2054f0 Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.132000 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:46:54 crc kubenswrapper[4786]: W0313 11:46:54.133300 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0ccf2605245a22a801ee6303b83fd18c030429aba6506647f3dfd1650f45acc8 WatchSource:0}: Error finding container 0ccf2605245a22a801ee6303b83fd18c030429aba6506647f3dfd1650f45acc8: Status 404 returned error can't find the container with id 0ccf2605245a22a801ee6303b83fd18c030429aba6506647f3dfd1650f45acc8 Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.139214 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 11:46:54 crc kubenswrapper[4786]: W0313 11:46:54.159475 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8ddc012722f818e6af25522bf5eacb57a10ae0f08aefa3330c9475d79e8ec293 WatchSource:0}: Error finding container 8ddc012722f818e6af25522bf5eacb57a10ae0f08aefa3330c9475d79e8ec293: Status 404 returned error can't find the container with id 8ddc012722f818e6af25522bf5eacb57a10ae0f08aefa3330c9475d79e8ec293 Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.249666 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.251331 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.251385 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.251403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.251437 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:46:54 crc kubenswrapper[4786]: E0313 11:46:54.251953 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 13 11:46:54 crc kubenswrapper[4786]: W0313 11:46:54.364943 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:54 crc kubenswrapper[4786]: E0313 11:46:54.365069 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.374212 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:54 crc kubenswrapper[4786]: W0313 11:46:54.413345 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:54 crc kubenswrapper[4786]: E0313 11:46:54.413479 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.444327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"71dc071234b1090c7b00eb21d31e8f63f5a811debdfcc893bb0cac904b2054f0"} Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.446473 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d63fba13b335d463adaccbbc61b8d56609349f139fd53767f0551d628583fbf"} Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.448718 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8ddc012722f818e6af25522bf5eacb57a10ae0f08aefa3330c9475d79e8ec293"} Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.450183 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"24aeaa11498957cfbd2feb5f2df9545d055fac22c43ff29113587b9b63dc1bf3"} Mar 13 11:46:54 crc kubenswrapper[4786]: I0313 11:46:54.451107 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0ccf2605245a22a801ee6303b83fd18c030429aba6506647f3dfd1650f45acc8"} Mar 13 11:46:54 crc kubenswrapper[4786]: W0313 11:46:54.592383 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:54 crc kubenswrapper[4786]: E0313 11:46:54.592497 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:54 crc kubenswrapper[4786]: W0313 11:46:54.624916 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:54 crc kubenswrapper[4786]: E0313 11:46:54.625177 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:54 crc kubenswrapper[4786]: E0313 11:46:54.796578 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.052973 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.056171 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:55 crc 
kubenswrapper[4786]: I0313 11:46:55.056215 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.056233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.056267 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:46:55 crc kubenswrapper[4786]: E0313 11:46:55.056855 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.374046 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.433450 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:46:55 crc kubenswrapper[4786]: E0313 11:46:55.434596 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.455637 4786 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3" exitCode=0 Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.455731 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3"} Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.455775 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.456930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.456962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.456974 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.457597 4786 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116" exitCode=0 Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.457635 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116"} Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.457698 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.458704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.458744 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.458760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.461016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e"} Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.461055 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63"} Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.461071 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c"} Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.463124 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706" exitCode=0 Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.463197 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706"} Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.463241 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.464899 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.464939 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.464957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.465583 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.465632 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584" exitCode=0 Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.465691 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584"} Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.466626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.466660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.466672 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.469053 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.470213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 
11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.470243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:55 crc kubenswrapper[4786]: I0313 11:46:55.470254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:56 crc kubenswrapper[4786]: W0313 11:46:56.212449 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:56 crc kubenswrapper[4786]: E0313 11:46:56.212607 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.373920 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:56 crc kubenswrapper[4786]: E0313 11:46:56.398415 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.469412 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806" exitCode=0 Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 
11:46:56.469621 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.469858 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.470759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.470906 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.471009 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.473619 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.473742 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.474487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.474509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.474517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 
11:46:56.475462 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.475607 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.475722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.475484 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.476658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.476777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.476896 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.476924 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 
11:46:56.476967 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.477585 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.477605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.477613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.478956 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.478995 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.479007 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302"} Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.479017 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b"} Mar 13 11:46:56 crc kubenswrapper[4786]: W0313 11:46:56.542136 4786 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:46:56 crc kubenswrapper[4786]: E0313 11:46:56.542220 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.658019 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.659160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.659191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.659202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:56 crc kubenswrapper[4786]: I0313 11:46:56.659223 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:46:56 crc kubenswrapper[4786]: E0313 11:46:56.659780 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.484690 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e" exitCode=0 Mar 13 11:46:57 crc 
kubenswrapper[4786]: I0313 11:46:57.484783 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e"} Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.484971 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.486065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.486115 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.486131 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.490277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2233374b255e18caf5c4539eb2a5bc6ae604dc2f0027e7893d5ee13d9808a2d"} Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.490379 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.490402 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.490422 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.490442 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.490531 4786 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.492123 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.492151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.492161 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.492248 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.492295 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.492311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.492951 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.492990 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.493009 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.493078 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.493126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:57 crc 
kubenswrapper[4786]: I0313 11:46:57.493146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:57 crc kubenswrapper[4786]: I0313 11:46:57.575747 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.086626 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.380811 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.502700 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290"} Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.502749 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.502760 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.502827 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.502767 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198"} Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.502925 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:58 crc 
kubenswrapper[4786]: I0313 11:46:58.502928 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c"} Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.503075 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352"} Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.504219 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.504258 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.504289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.504373 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.504399 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.504415 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.504716 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 11:46:58.504766 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:58 crc kubenswrapper[4786]: I0313 
11:46:58.504783 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.271769 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.512607 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e"} Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.512645 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.513589 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.514831 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.514939 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.514966 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.515296 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.515330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.515344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 
11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.557257 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.632234 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.632544 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.632654 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.634846 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.634912 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.634930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.673323 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.860297 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.862067 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.862149 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.862192 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 11:46:59 crc kubenswrapper[4786]: I0313 11:46:59.862245 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.515926 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.516147 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.550121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.550206 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.550214 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.550257 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.550277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.550225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.824011 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.831160 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 
11:47:00.851053 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.971158 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.971363 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.973204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.973285 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:00 crc kubenswrapper[4786]: I0313 11:47:00.973305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:01 crc kubenswrapper[4786]: I0313 11:47:01.518248 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:01 crc kubenswrapper[4786]: I0313 11:47:01.518248 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:01 crc kubenswrapper[4786]: I0313 11:47:01.520785 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:01 crc kubenswrapper[4786]: I0313 11:47:01.520836 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:01 crc kubenswrapper[4786]: I0313 11:47:01.520849 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:01 crc kubenswrapper[4786]: I0313 11:47:01.520968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 13 11:47:01 crc kubenswrapper[4786]: I0313 11:47:01.521020 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:01 crc kubenswrapper[4786]: I0313 11:47:01.521041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.272170 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.272308 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.447004 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.447226 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.448784 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.448847 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.448857 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.521089 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.521222 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.521850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.521874 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.521900 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.522403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.522424 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:02 crc kubenswrapper[4786]: I0313 11:47:02.522432 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:03 crc kubenswrapper[4786]: E0313 11:47:03.556103 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:47:07 crc kubenswrapper[4786]: I0313 11:47:07.375067 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 13 11:47:07 crc kubenswrapper[4786]: W0313 11:47:07.409974 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 11:47:07 crc kubenswrapper[4786]: I0313 11:47:07.410068 4786 trace.go:236] Trace[617082187]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 11:46:57.408) (total time: 10001ms): Mar 13 11:47:07 crc kubenswrapper[4786]: Trace[617082187]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:47:07.409) Mar 13 11:47:07 crc kubenswrapper[4786]: Trace[617082187]: [10.00122325s] [10.00122325s] END Mar 13 11:47:07 crc kubenswrapper[4786]: E0313 11:47:07.410092 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 11:47:07 crc kubenswrapper[4786]: W0313 11:47:07.532384 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 11:47:07 crc kubenswrapper[4786]: I0313 11:47:07.532516 4786 trace.go:236] Trace[1591660662]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 11:46:57.530) (total time: 10001ms): Mar 13 11:47:07 crc kubenswrapper[4786]: Trace[1591660662]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:47:07.532) Mar 13 11:47:07 crc kubenswrapper[4786]: Trace[1591660662]: [10.001781091s] [10.001781091s] END Mar 13 
11:47:07 crc kubenswrapper[4786]: E0313 11:47:07.532544 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 11:47:07 crc kubenswrapper[4786]: E0313 11:47:07.906390 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:07Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:47:07 crc kubenswrapper[4786]: W0313 11:47:07.908073 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:07Z is after 2026-02-23T05:33:13Z Mar 13 11:47:07 crc kubenswrapper[4786]: E0313 11:47:07.908151 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:07 crc kubenswrapper[4786]: W0313 11:47:07.910212 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-13T11:47:07Z is after 2026-02-23T05:33:13Z Mar 13 11:47:07 crc kubenswrapper[4786]: E0313 11:47:07.910260 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:07 crc kubenswrapper[4786]: E0313 11:47:07.912438 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:07Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c64195ccefa77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,LastTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:07 crc kubenswrapper[4786]: E0313 11:47:07.916790 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:07Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 13 11:47:07 crc kubenswrapper[4786]: 
E0313 11:47:07.917816 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:07 crc kubenswrapper[4786]: I0313 11:47:07.921138 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 11:47:07 crc kubenswrapper[4786]: I0313 11:47:07.921188 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 11:47:07 crc kubenswrapper[4786]: I0313 11:47:07.927364 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 11:47:07 crc kubenswrapper[4786]: I0313 11:47:07.927431 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.089972 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.090332 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.091605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.091640 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.091651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.376315 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:08Z is after 2026-02-23T05:33:13Z Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.539085 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.541183 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2233374b255e18caf5c4539eb2a5bc6ae604dc2f0027e7893d5ee13d9808a2d" exitCode=255 Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.541236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c2233374b255e18caf5c4539eb2a5bc6ae604dc2f0027e7893d5ee13d9808a2d"} Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.541436 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.542663 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.542708 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.542720 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:08 crc kubenswrapper[4786]: I0313 11:47:08.543315 4786 scope.go:117] "RemoveContainer" containerID="c2233374b255e18caf5c4539eb2a5bc6ae604dc2f0027e7893d5ee13d9808a2d" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.379949 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:09Z is after 2026-02-23T05:33:13Z Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.546552 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.549780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6"} Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.549927 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.551233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.551305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.551326 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.578072 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.578226 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.580557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.580602 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.580618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.595545 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 13 11:47:09 crc kubenswrapper[4786]: I0313 11:47:09.637844 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.379446 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:10Z is after 2026-02-23T05:33:13Z Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.555656 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.556278 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.559903 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6" exitCode=255 Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.560020 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6"} Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.560053 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.560073 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.560087 4786 scope.go:117] "RemoveContainer" 
containerID="c2233374b255e18caf5c4539eb2a5bc6ae604dc2f0027e7893d5ee13d9808a2d" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.561423 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.561464 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.561479 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.561633 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.561684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.561702 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.562521 4786 scope.go:117] "RemoveContainer" containerID="8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6" Mar 13 11:47:10 crc kubenswrapper[4786]: E0313 11:47:10.562815 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 11:47:10.566483 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:47:10 crc kubenswrapper[4786]: I0313 
11:47:10.971845 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:47:11 crc kubenswrapper[4786]: I0313 11:47:11.376661 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:11Z is after 2026-02-23T05:33:13Z Mar 13 11:47:11 crc kubenswrapper[4786]: I0313 11:47:11.565193 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 11:47:11 crc kubenswrapper[4786]: I0313 11:47:11.567446 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:11 crc kubenswrapper[4786]: I0313 11:47:11.568225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:11 crc kubenswrapper[4786]: I0313 11:47:11.568284 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:11 crc kubenswrapper[4786]: I0313 11:47:11.568305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:11 crc kubenswrapper[4786]: I0313 11:47:11.569152 4786 scope.go:117] "RemoveContainer" containerID="8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6" Mar 13 11:47:11 crc kubenswrapper[4786]: E0313 11:47:11.569439 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:12 crc kubenswrapper[4786]: I0313 11:47:12.272623 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:47:12 crc kubenswrapper[4786]: I0313 11:47:12.272696 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:47:12 crc kubenswrapper[4786]: W0313 11:47:12.350397 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:12Z is after 2026-02-23T05:33:13Z Mar 13 11:47:12 crc kubenswrapper[4786]: E0313 11:47:12.350476 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 
11:47:12 crc kubenswrapper[4786]: I0313 11:47:12.376190 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:12Z is after 2026-02-23T05:33:13Z Mar 13 11:47:12 crc kubenswrapper[4786]: I0313 11:47:12.569606 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:12 crc kubenswrapper[4786]: I0313 11:47:12.570482 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:12 crc kubenswrapper[4786]: I0313 11:47:12.570522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:12 crc kubenswrapper[4786]: I0313 11:47:12.570532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:12 crc kubenswrapper[4786]: I0313 11:47:12.571057 4786 scope.go:117] "RemoveContainer" containerID="8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6" Mar 13 11:47:12 crc kubenswrapper[4786]: E0313 11:47:12.571216 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:12 crc kubenswrapper[4786]: W0313 11:47:12.844248 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:12Z is after 2026-02-23T05:33:13Z Mar 13 11:47:12 crc kubenswrapper[4786]: E0313 11:47:12.844363 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:13 crc kubenswrapper[4786]: I0313 11:47:13.376638 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:13Z is after 2026-02-23T05:33:13Z Mar 13 11:47:13 crc kubenswrapper[4786]: E0313 11:47:13.557084 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.306831 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.308249 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.308315 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.308331 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.308356 4786 kubelet_node_status.go:76] "Attempting 
to register node" node="crc" Mar 13 11:47:14 crc kubenswrapper[4786]: E0313 11:47:14.311808 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:14Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:47:14 crc kubenswrapper[4786]: E0313 11:47:14.320144 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:14Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.376977 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:14Z is after 2026-02-23T05:33:13Z Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.504748 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.504966 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.505934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.505969 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.505977 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:14 crc kubenswrapper[4786]: I0313 11:47:14.506420 4786 scope.go:117] "RemoveContainer" containerID="8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6" Mar 13 11:47:14 crc kubenswrapper[4786]: E0313 11:47:14.506565 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:15 crc kubenswrapper[4786]: I0313 11:47:15.376400 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:15Z is after 2026-02-23T05:33:13Z Mar 13 11:47:16 crc kubenswrapper[4786]: I0313 11:47:16.299754 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:47:16 crc kubenswrapper[4786]: E0313 11:47:16.305294 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:16 crc kubenswrapper[4786]: I0313 11:47:16.375985 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:16Z is after 2026-02-23T05:33:13Z Mar 13 11:47:16 crc kubenswrapper[4786]: W0313 11:47:16.955402 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:16Z is after 2026-02-23T05:33:13Z Mar 13 11:47:16 crc kubenswrapper[4786]: E0313 11:47:16.955549 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:17 crc kubenswrapper[4786]: I0313 11:47:17.378846 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:17Z is after 2026-02-23T05:33:13Z Mar 13 11:47:17 crc kubenswrapper[4786]: E0313 11:47:17.919045 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:17Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c64195ccefa77 default 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,LastTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:18 crc kubenswrapper[4786]: W0313 11:47:18.142359 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:18Z is after 2026-02-23T05:33:13Z Mar 13 11:47:18 crc kubenswrapper[4786]: E0313 11:47:18.142451 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:18 crc kubenswrapper[4786]: I0313 11:47:18.378614 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:18Z is after 2026-02-23T05:33:13Z Mar 13 11:47:19 crc kubenswrapper[4786]: I0313 11:47:19.377549 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:19Z is after 2026-02-23T05:33:13Z Mar 13 11:47:20 crc kubenswrapper[4786]: I0313 11:47:20.378162 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:20Z is after 2026-02-23T05:33:13Z Mar 13 11:47:21 crc kubenswrapper[4786]: I0313 11:47:21.312966 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:21 crc kubenswrapper[4786]: I0313 11:47:21.314675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:21 crc kubenswrapper[4786]: I0313 11:47:21.314928 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:21 crc kubenswrapper[4786]: I0313 11:47:21.315105 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:21 crc kubenswrapper[4786]: I0313 11:47:21.315314 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:47:21 crc kubenswrapper[4786]: E0313 11:47:21.318982 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:47:21 crc kubenswrapper[4786]: E0313 11:47:21.324591 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 11:47:21 crc kubenswrapper[4786]: I0313 11:47:21.377947 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 2026-02-23T05:33:13Z Mar 13 11:47:21 crc kubenswrapper[4786]: W0313 11:47:21.479864 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 2026-02-23T05:33:13Z Mar 13 11:47:21 crc kubenswrapper[4786]: E0313 11:47:21.480909 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.272490 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= 
Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.272588 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.272694 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.272989 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.274963 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.275050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.275069 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.275844 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.276203 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63" gracePeriod=30 Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.378427 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:22Z is after 2026-02-23T05:33:13Z Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.599319 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.600113 4786 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63" exitCode=255 Mar 13 11:47:22 crc kubenswrapper[4786]: I0313 11:47:22.600173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63"} Mar 13 11:47:22 crc kubenswrapper[4786]: W0313 11:47:22.662626 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:22Z is after 2026-02-23T05:33:13Z Mar 13 11:47:22 crc kubenswrapper[4786]: E0313 11:47:22.662738 4786 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:22Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:23 crc kubenswrapper[4786]: I0313 11:47:23.378595 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:23Z is after 2026-02-23T05:33:13Z Mar 13 11:47:23 crc kubenswrapper[4786]: E0313 11:47:23.557202 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:47:23 crc kubenswrapper[4786]: I0313 11:47:23.605853 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 11:47:23 crc kubenswrapper[4786]: I0313 11:47:23.607243 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a"} Mar 13 11:47:23 crc kubenswrapper[4786]: I0313 11:47:23.607344 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:23 crc kubenswrapper[4786]: I0313 11:47:23.608522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:23 crc kubenswrapper[4786]: I0313 11:47:23.608574 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:23 crc kubenswrapper[4786]: I0313 11:47:23.608591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:24 crc kubenswrapper[4786]: I0313 11:47:24.378075 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:24Z is after 2026-02-23T05:33:13Z Mar 13 11:47:24 crc kubenswrapper[4786]: I0313 11:47:24.609344 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:24 crc kubenswrapper[4786]: I0313 11:47:24.610734 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:24 crc kubenswrapper[4786]: I0313 11:47:24.610781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:24 crc kubenswrapper[4786]: I0313 11:47:24.610793 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:25 crc kubenswrapper[4786]: I0313 11:47:25.378387 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:25Z is after 2026-02-23T05:33:13Z Mar 13 11:47:26 crc kubenswrapper[4786]: I0313 11:47:26.378467 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:26Z is after 2026-02-23T05:33:13Z Mar 13 11:47:27 crc kubenswrapper[4786]: I0313 11:47:27.376539 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:27Z is after 2026-02-23T05:33:13Z Mar 13 11:47:27 crc kubenswrapper[4786]: I0313 11:47:27.576393 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:47:27 crc kubenswrapper[4786]: I0313 11:47:27.576550 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:27 crc kubenswrapper[4786]: I0313 11:47:27.577737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:27 crc kubenswrapper[4786]: I0313 11:47:27.577805 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:27 crc kubenswrapper[4786]: I0313 11:47:27.577820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:27 crc kubenswrapper[4786]: E0313 11:47:27.925115 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:27Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c64195ccefa77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,LastTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:28 crc kubenswrapper[4786]: I0313 11:47:28.319999 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:28 crc kubenswrapper[4786]: I0313 11:47:28.321535 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:28 crc kubenswrapper[4786]: I0313 11:47:28.321670 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:28 crc kubenswrapper[4786]: I0313 11:47:28.321748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:28 crc kubenswrapper[4786]: I0313 11:47:28.321837 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:47:28 crc kubenswrapper[4786]: E0313 11:47:28.324842 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:28Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:47:28 crc kubenswrapper[4786]: E0313 11:47:28.329634 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-13T11:47:28Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 11:47:28 crc kubenswrapper[4786]: I0313 11:47:28.377877 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:28Z is after 2026-02-23T05:33:13Z Mar 13 11:47:29 crc kubenswrapper[4786]: I0313 11:47:29.272555 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:47:29 crc kubenswrapper[4786]: I0313 11:47:29.272780 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:29 crc kubenswrapper[4786]: I0313 11:47:29.274515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:29 crc kubenswrapper[4786]: I0313 11:47:29.274572 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:29 crc kubenswrapper[4786]: I0313 11:47:29.274597 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:29 crc kubenswrapper[4786]: I0313 11:47:29.377066 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:29Z is after 2026-02-23T05:33:13Z Mar 13 11:47:30 crc kubenswrapper[4786]: I0313 11:47:30.375925 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:30Z is after 2026-02-23T05:33:13Z Mar 13 11:47:30 crc kubenswrapper[4786]: I0313 11:47:30.439849 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:30 crc kubenswrapper[4786]: I0313 11:47:30.441685 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:30 crc kubenswrapper[4786]: I0313 11:47:30.441735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:30 crc kubenswrapper[4786]: I0313 11:47:30.441749 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:30 crc kubenswrapper[4786]: I0313 11:47:30.442651 4786 scope.go:117] "RemoveContainer" containerID="8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6" Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.377809 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:31Z is after 2026-02-23T05:33:13Z Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.631619 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.632377 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 11:47:31 
crc kubenswrapper[4786]: I0313 11:47:31.634394 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="489aad1d5e6bcb64f532c82235551c0b558999f1c3d8d1cb4651c422dcbbab0a" exitCode=255 Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.634447 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"489aad1d5e6bcb64f532c82235551c0b558999f1c3d8d1cb4651c422dcbbab0a"} Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.634681 4786 scope.go:117] "RemoveContainer" containerID="8aff95ddf5e5cbfa94050f9dd80edb7741bee3db85d0e99b1029c88fdbd3dde6" Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.634906 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.636284 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.636408 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.636582 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:31 crc kubenswrapper[4786]: I0313 11:47:31.641139 4786 scope.go:117] "RemoveContainer" containerID="489aad1d5e6bcb64f532c82235551c0b558999f1c3d8d1cb4651c422dcbbab0a" Mar 13 11:47:31 crc kubenswrapper[4786]: E0313 11:47:31.641424 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:32 crc kubenswrapper[4786]: I0313 11:47:32.273579 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:47:32 crc kubenswrapper[4786]: I0313 11:47:32.273635 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:47:32 crc kubenswrapper[4786]: I0313 11:47:32.381381 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:32Z is after 2026-02-23T05:33:13Z Mar 13 11:47:32 crc kubenswrapper[4786]: I0313 11:47:32.639070 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 11:47:33 crc kubenswrapper[4786]: I0313 11:47:33.377537 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:33Z is after 2026-02-23T05:33:13Z 
Mar 13 11:47:33 crc kubenswrapper[4786]: E0313 11:47:33.557368 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:47:33 crc kubenswrapper[4786]: I0313 11:47:33.685593 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:47:33 crc kubenswrapper[4786]: E0313 11:47:33.691508 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:33 crc kubenswrapper[4786]: E0313 11:47:33.692798 4786 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 13 11:47:33 crc kubenswrapper[4786]: W0313 11:47:33.820975 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:33Z is after 2026-02-23T05:33:13Z Mar 13 11:47:33 crc kubenswrapper[4786]: E0313 11:47:33.821284 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-13T11:47:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:34 crc kubenswrapper[4786]: I0313 11:47:34.378326 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:34Z is after 2026-02-23T05:33:13Z Mar 13 11:47:34 crc kubenswrapper[4786]: I0313 11:47:34.504973 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:47:34 crc kubenswrapper[4786]: I0313 11:47:34.505324 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:34 crc kubenswrapper[4786]: I0313 11:47:34.507114 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:34 crc kubenswrapper[4786]: I0313 11:47:34.507156 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:34 crc kubenswrapper[4786]: I0313 11:47:34.507168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:34 crc kubenswrapper[4786]: I0313 11:47:34.507756 4786 scope.go:117] "RemoveContainer" containerID="489aad1d5e6bcb64f532c82235551c0b558999f1c3d8d1cb4651c422dcbbab0a" Mar 13 11:47:34 crc kubenswrapper[4786]: E0313 11:47:34.507957 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:35 crc kubenswrapper[4786]: I0313 11:47:35.325771 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:35 crc kubenswrapper[4786]: I0313 11:47:35.327067 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:35 crc kubenswrapper[4786]: I0313 11:47:35.327102 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:35 crc kubenswrapper[4786]: I0313 11:47:35.327112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:35 crc kubenswrapper[4786]: I0313 11:47:35.327142 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:47:35 crc kubenswrapper[4786]: E0313 11:47:35.332319 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:35Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:47:35 crc kubenswrapper[4786]: E0313 11:47:35.333300 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:35Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 11:47:35 crc kubenswrapper[4786]: I0313 11:47:35.375730 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T11:47:35Z is after 2026-02-23T05:33:13Z Mar 13 11:47:36 crc kubenswrapper[4786]: I0313 11:47:36.379229 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:36Z is after 2026-02-23T05:33:13Z Mar 13 11:47:37 crc kubenswrapper[4786]: I0313 11:47:37.376111 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:37Z is after 2026-02-23T05:33:13Z Mar 13 11:47:37 crc kubenswrapper[4786]: W0313 11:47:37.734319 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:37Z is after 2026-02-23T05:33:13Z Mar 13 11:47:37 crc kubenswrapper[4786]: E0313 11:47:37.734395 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:37 crc kubenswrapper[4786]: E0313 11:47:37.928444 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:37Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c64195ccefa77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,LastTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:38 crc kubenswrapper[4786]: I0313 11:47:38.378290 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:38Z is after 2026-02-23T05:33:13Z Mar 13 11:47:39 crc kubenswrapper[4786]: W0313 11:47:39.321855 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:39Z is after 2026-02-23T05:33:13Z Mar 13 11:47:39 crc kubenswrapper[4786]: E0313 11:47:39.322013 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:39Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:39 crc kubenswrapper[4786]: I0313 11:47:39.378796 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:39Z is after 2026-02-23T05:33:13Z Mar 13 11:47:40 crc kubenswrapper[4786]: I0313 11:47:40.377870 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:40Z is after 2026-02-23T05:33:13Z Mar 13 11:47:40 crc kubenswrapper[4786]: I0313 11:47:40.972058 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:47:40 crc kubenswrapper[4786]: I0313 11:47:40.972337 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:40 crc kubenswrapper[4786]: I0313 11:47:40.973993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:40 crc kubenswrapper[4786]: I0313 11:47:40.974038 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:40 crc kubenswrapper[4786]: I0313 11:47:40.974058 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:40 crc kubenswrapper[4786]: I0313 11:47:40.974929 4786 scope.go:117] "RemoveContainer" containerID="489aad1d5e6bcb64f532c82235551c0b558999f1c3d8d1cb4651c422dcbbab0a" Mar 13 11:47:40 crc kubenswrapper[4786]: E0313 11:47:40.975227 4786 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:41 crc kubenswrapper[4786]: W0313 11:47:41.125486 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:41Z is after 2026-02-23T05:33:13Z Mar 13 11:47:41 crc kubenswrapper[4786]: E0313 11:47:41.125564 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:47:41 crc kubenswrapper[4786]: I0313 11:47:41.379748 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:41Z is after 2026-02-23T05:33:13Z Mar 13 11:47:42 crc kubenswrapper[4786]: I0313 11:47:42.272246 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:47:42 crc kubenswrapper[4786]: I0313 11:47:42.272327 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:47:42 crc kubenswrapper[4786]: I0313 11:47:42.332846 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:42 crc kubenswrapper[4786]: I0313 11:47:42.334086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:42 crc kubenswrapper[4786]: I0313 11:47:42.334118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:42 crc kubenswrapper[4786]: I0313 11:47:42.334173 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:42 crc kubenswrapper[4786]: I0313 11:47:42.334204 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:47:42 crc kubenswrapper[4786]: E0313 11:47:42.337315 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:42Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 11:47:42 crc kubenswrapper[4786]: E0313 11:47:42.338140 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:42Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:47:42 crc kubenswrapper[4786]: I0313 11:47:42.380180 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:42Z is after 2026-02-23T05:33:13Z Mar 13 11:47:43 crc kubenswrapper[4786]: I0313 11:47:43.377988 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:43Z is after 2026-02-23T05:33:13Z Mar 13 11:47:43 crc kubenswrapper[4786]: E0313 11:47:43.557645 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:47:44 crc kubenswrapper[4786]: I0313 11:47:44.377412 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:45 crc kubenswrapper[4786]: I0313 11:47:45.380518 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:46 crc kubenswrapper[4786]: I0313 11:47:46.380829 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 13 11:47:47 crc kubenswrapper[4786]: I0313 11:47:47.382456 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:47 crc kubenswrapper[4786]: E0313 11:47:47.937253 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c64195ccefa77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,LastTimestamp:2026-03-13 11:46:53.371161207 +0000 UTC m=+0.650814714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:47 crc kubenswrapper[4786]: E0313 11:47:47.945049 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612b5ca4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,LastTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:47 crc kubenswrapper[4786]: E0313 11:47:47.951949 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bb438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC m=+0.724000413,LastTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC m=+0.724000413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:47 crc kubenswrapper[4786]: E0313 11:47:47.961101 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bf061 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444362337 +0000 UTC m=+0.724015824,LastTimestamp:2026-03-13 11:46:53.444362337 +0000 UTC m=+0.724015824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:47 crc kubenswrapper[4786]: E0313 11:47:47.968345 4786 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419673f9715 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.546313493 +0000 UTC m=+0.825966940,LastTimestamp:2026-03-13 11:46:53.546313493 +0000 UTC m=+0.825966940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:47 crc kubenswrapper[4786]: E0313 11:47:47.975777 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612b5ca4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612b5ca4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,LastTimestamp:2026-03-13 11:46:53.645179782 +0000 UTC m=+0.924833259,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:47 crc kubenswrapper[4786]: E0313 11:47:47.985021 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bb438\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189c6419612bb438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC m=+0.724000413,LastTimestamp:2026-03-13 11:46:53.645209073 +0000 UTC m=+0.924862560,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:47 crc kubenswrapper[4786]: E0313 11:47:47.993996 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bf061\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bf061 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444362337 +0000 UTC m=+0.724015824,LastTimestamp:2026-03-13 11:46:53.645224073 +0000 UTC m=+0.924877550,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.004452 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612b5ca4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612b5ca4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,LastTimestamp:2026-03-13 11:46:53.742415232 +0000 UTC m=+1.022068719,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.013494 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bb438\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bb438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC m=+0.724000413,LastTimestamp:2026-03-13 11:46:53.742452342 +0000 UTC m=+1.022105829,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.020178 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bf061\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bf061 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444362337 +0000 UTC m=+0.724015824,LastTimestamp:2026-03-13 11:46:53.742469153 +0000 UTC m=+1.022122640,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.027839 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612b5ca4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612b5ca4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,LastTimestamp:2026-03-13 11:46:53.744009741 +0000 UTC m=+1.023663218,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.033950 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bb438\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bb438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC 
m=+0.724000413,LastTimestamp:2026-03-13 11:46:53.744031061 +0000 UTC m=+1.023684548,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.040524 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bf061\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bf061 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444362337 +0000 UTC m=+0.724015824,LastTimestamp:2026-03-13 11:46:53.744046771 +0000 UTC m=+1.023700248,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.046634 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612b5ca4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612b5ca4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,LastTimestamp:2026-03-13 11:46:53.744571722 +0000 UTC m=+1.024225209,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.050983 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bb438\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bb438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC m=+0.724000413,LastTimestamp:2026-03-13 11:46:53.744608482 +0000 UTC m=+1.024261969,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.054132 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bf061\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bf061 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444362337 +0000 UTC m=+0.724015824,LastTimestamp:2026-03-13 11:46:53.744625392 +0000 UTC m=+1.024278879,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.059800 4786 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189c6419612b5ca4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612b5ca4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,LastTimestamp:2026-03-13 11:46:53.745251673 +0000 UTC m=+1.024905160,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.066780 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bb438\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bb438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC m=+0.724000413,LastTimestamp:2026-03-13 11:46:53.745278684 +0000 UTC m=+1.024932171,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.074196 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bf061\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bf061 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444362337 +0000 UTC m=+0.724015824,LastTimestamp:2026-03-13 11:46:53.745302324 +0000 UTC m=+1.024955811,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.082185 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612b5ca4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612b5ca4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,LastTimestamp:2026-03-13 11:46:53.745548629 +0000 UTC m=+1.025202076,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.089559 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bb438\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bb438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC m=+0.724000413,LastTimestamp:2026-03-13 11:46:53.745561209 +0000 UTC m=+1.025214666,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.096717 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bf061\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bf061 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444362337 +0000 UTC m=+0.724015824,LastTimestamp:2026-03-13 11:46:53.745569819 +0000 UTC m=+1.025223266,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.104750 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612b5ca4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612b5ca4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444324516 +0000 UTC m=+0.723977993,LastTimestamp:2026-03-13 11:46:53.746609128 +0000 UTC m=+1.026262605,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.111618 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6419612bb438\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6419612bb438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:53.444346936 +0000 UTC m=+0.724000413,LastTimestamp:2026-03-13 11:46:53.746641489 +0000 UTC m=+1.026294966,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.120286 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c64198a668846 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.136068166 +0000 UTC m=+1.415721653,LastTimestamp:2026-03-13 11:46:54.136068166 +0000 UTC m=+1.415721653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.127363 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c64198a67d6e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.136153827 +0000 UTC m=+1.415807324,LastTimestamp:2026-03-13 11:46:54.136153827 +0000 UTC m=+1.415807324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.134113 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c64198a798acf 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.137313999 +0000 UTC m=+1.416967466,LastTimestamp:2026-03-13 11:46:54.137313999 +0000 UTC m=+1.416967466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.142431 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c64198b140f32 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.147440434 +0000 UTC m=+1.427093921,LastTimestamp:2026-03-13 11:46:54.147440434 +0000 UTC m=+1.427093921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.149981 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c64198c2bcbdf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.165773279 +0000 UTC m=+1.445426736,LastTimestamp:2026-03-13 11:46:54.165773279 +0000 UTC m=+1.445426736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.156948 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419ad8f2841 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 
11:46:54.725933121 +0000 UTC m=+2.005586598,LastTimestamp:2026-03-13 11:46:54.725933121 +0000 UTC m=+2.005586598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.161221 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419adafe365 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.728078181 +0000 UTC m=+2.007731668,LastTimestamp:2026-03-13 11:46:54.728078181 +0000 UTC m=+2.007731668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.163826 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419adb4ee83 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.728408707 +0000 UTC m=+2.008062194,LastTimestamp:2026-03-13 11:46:54.728408707 +0000 UTC m=+2.008062194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.167719 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6419adb9b918 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.728722712 +0000 UTC m=+2.008376189,LastTimestamp:2026-03-13 11:46:54.728722712 +0000 UTC m=+2.008376189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.174093 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6419ae2df684 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.736340612 +0000 UTC m=+2.015994089,LastTimestamp:2026-03-13 11:46:54.736340612 +0000 UTC m=+2.015994089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.180811 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419ae590d9e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.739164574 +0000 UTC m=+2.018818061,LastTimestamp:2026-03-13 11:46:54.739164574 +0000 UTC m=+2.018818061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.186446 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419ae9b496f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.743505263 +0000 UTC m=+2.023158710,LastTimestamp:2026-03-13 11:46:54.743505263 +0000 UTC m=+2.023158710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.193349 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419aeb7b941 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.745368897 +0000 UTC m=+2.025022344,LastTimestamp:2026-03-13 11:46:54.745368897 +0000 UTC m=+2.025022344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.199817 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419aec86543 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.746461507 +0000 UTC m=+2.026114954,LastTimestamp:2026-03-13 11:46:54.746461507 +0000 UTC m=+2.026114954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.204587 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6419af2281d9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.752367065 +0000 UTC m=+2.032020512,LastTimestamp:2026-03-13 11:46:54.752367065 +0000 UTC m=+2.032020512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: 
E0313 11:47:48.213044 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6419af3492f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.753551097 +0000 UTC m=+2.033204544,LastTimestamp:2026-03-13 11:46:54.753551097 +0000 UTC m=+2.033204544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.219228 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419c191f9ae openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.061662126 +0000 UTC m=+2.341315663,LastTimestamp:2026-03-13 11:46:55.061662126 +0000 UTC m=+2.341315663,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 
11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.225093 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419c26ac0ae openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.075868846 +0000 UTC m=+2.355522323,LastTimestamp:2026-03-13 11:46:55.075868846 +0000 UTC m=+2.355522323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.232482 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419c282da72 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.077448306 +0000 UTC m=+2.357101793,LastTimestamp:2026-03-13 11:46:55.077448306 +0000 UTC m=+2.357101793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.239874 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419d196fc9c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.330426012 +0000 UTC m=+2.610079499,LastTimestamp:2026-03-13 11:46:55.330426012 +0000 UTC m=+2.610079499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.244596 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419d2699502 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.344227586 +0000 UTC m=+2.623881033,LastTimestamp:2026-03-13 11:46:55.344227586 +0000 UTC m=+2.623881033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.251388 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419d284de6e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.346015854 +0000 UTC m=+2.625669331,LastTimestamp:2026-03-13 11:46:55.346015854 +0000 UTC m=+2.625669331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.256459 4786 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6419d93e1af9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.458818809 +0000 UTC m=+2.738472296,LastTimestamp:2026-03-13 11:46:55.458818809 +0000 UTC m=+2.738472296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.264052 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419d95614ae openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.460390062 +0000 UTC m=+2.740043539,LastTimestamp:2026-03-13 11:46:55.460390062 +0000 UTC m=+2.740043539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.268299 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419d9d72170 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.468847472 +0000 UTC m=+2.748500919,LastTimestamp:2026-03-13 11:46:55.468847472 +0000 UTC m=+2.748500919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.272953 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6419d9e42e7d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.469702781 +0000 UTC m=+2.749356248,LastTimestamp:2026-03-13 11:46:55.469702781 +0000 UTC m=+2.749356248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.279572 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419e014d494 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.573554324 +0000 UTC m=+2.853207771,LastTimestamp:2026-03-13 11:46:55.573554324 +0000 UTC m=+2.853207771,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.286451 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419e1141658 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.59028284 +0000 UTC m=+2.869936287,LastTimestamp:2026-03-13 11:46:55.59028284 +0000 UTC m=+2.869936287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.294224 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419e5fe5276 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.672742518 +0000 UTC m=+2.952395965,LastTimestamp:2026-03-13 11:46:55.672742518 +0000 UTC m=+2.952395965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.301276 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6419e61dfde6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.674818022 +0000 UTC m=+2.954471469,LastTimestamp:2026-03-13 11:46:55.674818022 +0000 UTC m=+2.954471469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.305516 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6419e61f3b0d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.674899213 +0000 UTC m=+2.954552660,LastTimestamp:2026-03-13 11:46:55.674899213 +0000 UTC m=+2.954552660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.310026 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419e635d8dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.676381405 +0000 UTC m=+2.956034842,LastTimestamp:2026-03-13 11:46:55.676381405 +0000 UTC m=+2.956034842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.314332 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419e6a29687 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.683507847 +0000 UTC m=+2.963161294,LastTimestamp:2026-03-13 11:46:55.683507847 +0000 UTC 
m=+2.963161294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.320662 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419e6b48d10 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.684685072 +0000 UTC m=+2.964338519,LastTimestamp:2026-03-13 11:46:55.684685072 +0000 UTC m=+2.964338519,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.327080 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6419e6f3d5ea openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.68883249 +0000 UTC m=+2.968485927,LastTimestamp:2026-03-13 11:46:55.68883249 +0000 UTC m=+2.968485927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.332738 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419e7242cc9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.692000457 +0000 UTC m=+2.971653904,LastTimestamp:2026-03-13 11:46:55.692000457 +0000 UTC m=+2.971653904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.337105 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419e7360567 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.693170023 +0000 UTC m=+2.972823470,LastTimestamp:2026-03-13 11:46:55.693170023 +0000 UTC m=+2.972823470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.343105 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6419e79676a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.699490468 +0000 UTC m=+2.979143915,LastTimestamp:2026-03-13 11:46:55.699490468 +0000 UTC m=+2.979143915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.347816 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419f2445bf0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.878659056 +0000 UTC m=+3.158312503,LastTimestamp:2026-03-13 11:46:55.878659056 +0000 UTC m=+3.158312503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.355175 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419f250ccd9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.879474393 +0000 UTC m=+3.159127840,LastTimestamp:2026-03-13 11:46:55.879474393 +0000 UTC m=+3.159127840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.360343 4786 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419f362ae4c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.897423436 +0000 UTC m=+3.177076883,LastTimestamp:2026-03-13 11:46:55.897423436 +0000 UTC m=+3.177076883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.364391 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419f37a3a1e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.898966558 +0000 UTC m=+3.178620045,LastTimestamp:2026-03-13 11:46:55.898966558 +0000 
UTC m=+3.178620045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.369224 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419f387c198 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.899853208 +0000 UTC m=+3.179506695,LastTimestamp:2026-03-13 11:46:55.899853208 +0000 UTC m=+3.179506695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: I0313 11:47:48.374589 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.374596 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419f39e6f11 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.901339409 +0000 UTC m=+3.180992896,LastTimestamp:2026-03-13 11:46:55.901339409 +0000 UTC m=+3.180992896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.380756 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419fd2f2b2a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.06181969 +0000 UTC m=+3.341473137,LastTimestamp:2026-03-13 11:46:56.06181969 +0000 UTC m=+3.341473137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: I0313 11:47:48.385176 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:47:48 crc kubenswrapper[4786]: I0313 11:47:48.385299 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.385751 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419fd4f77bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.063936444 +0000 UTC m=+3.343589891,LastTimestamp:2026-03-13 11:46:56.063936444 +0000 UTC m=+3.343589891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: I0313 11:47:48.387300 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:48 crc kubenswrapper[4786]: I0313 11:47:48.387343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:48 crc kubenswrapper[4786]: I0313 11:47:48.387355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.392179 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6419fdeb8e2a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.074165802 +0000 UTC m=+3.353819249,LastTimestamp:2026-03-13 11:46:56.074165802 +0000 UTC m=+3.353819249,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.398432 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419fe203f02 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.077618946 +0000 UTC m=+3.357272413,LastTimestamp:2026-03-13 11:46:56.077618946 +0000 UTC m=+3.357272413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc 
kubenswrapper[4786]: E0313 11:47:48.403224 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6419fe31b3ef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.078762991 +0000 UTC m=+3.358416448,LastTimestamp:2026-03-13 11:46:56.078762991 +0000 UTC m=+3.358416448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.408518 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641a09695bf5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.266959861 +0000 UTC 
m=+3.546613338,LastTimestamp:2026-03-13 11:46:56.266959861 +0000 UTC m=+3.546613338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.415005 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641a0a1b85c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.27863597 +0000 UTC m=+3.558289407,LastTimestamp:2026-03-13 11:46:56.27863597 +0000 UTC m=+3.558289407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.420713 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641a0a3284ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.280143103 +0000 UTC m=+3.559796550,LastTimestamp:2026-03-13 11:46:56.280143103 +0000 UTC m=+3.559796550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.425152 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a15a54690 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.472213136 +0000 UTC m=+3.751866593,LastTimestamp:2026-03-13 11:46:56.472213136 +0000 UTC m=+3.751866593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.430347 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641a163d3394 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.482169748 +0000 UTC m=+3.761823185,LastTimestamp:2026-03-13 11:46:56.482169748 +0000 UTC m=+3.761823185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.434288 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641a16e613d4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.493237204 +0000 UTC m=+3.772890651,LastTimestamp:2026-03-13 11:46:56.493237204 +0000 UTC m=+3.772890651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.440776 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c641a1fc2c626 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.641918502 +0000 UTC m=+3.921571949,LastTimestamp:2026-03-13 11:46:56.641918502 +0000 UTC m=+3.921571949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.445444 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a20582639 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.651707961 +0000 UTC m=+3.931361408,LastTimestamp:2026-03-13 11:46:56.651707961 +0000 UTC m=+3.931361408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.452582 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c641a523322c6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:57.488143046 +0000 UTC m=+4.767796533,LastTimestamp:2026-03-13 11:46:57.488143046 +0000 UTC m=+4.767796533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.458348 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a602ef153 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:57.722749267 +0000 UTC m=+5.002402764,LastTimestamp:2026-03-13 11:46:57.722749267 +0000 UTC m=+5.002402764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.464666 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a61175749 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:57.737979721 +0000 UTC m=+5.017633198,LastTimestamp:2026-03-13 11:46:57.737979721 +0000 UTC m=+5.017633198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.469597 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a61288956 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:57.739106646 +0000 UTC m=+5.018760133,LastTimestamp:2026-03-13 11:46:57.739106646 +0000 UTC m=+5.018760133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.474514 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a707c913f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:57.996271935 +0000 UTC m=+5.275925392,LastTimestamp:2026-03-13 11:46:57.996271935 +0000 UTC m=+5.275925392,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.481810 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a71904cf7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.014342391 +0000 UTC m=+5.293995878,LastTimestamp:2026-03-13 11:46:58.014342391 +0000 UTC m=+5.293995878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.488337 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c641a71a175c1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.015466945 +0000 UTC m=+5.295120442,LastTimestamp:2026-03-13 11:46:58.015466945 +0000 UTC m=+5.295120442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.494739 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a7fd23535 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.253542709 +0000 UTC m=+5.533196196,LastTimestamp:2026-03-13 11:46:58.253542709 +0000 UTC m=+5.533196196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.502236 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a80cab482 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.269828226 +0000 UTC m=+5.549481713,LastTimestamp:2026-03-13 11:46:58.269828226 +0000 UTC m=+5.549481713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.515200 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a80e2655d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.271380829 +0000 UTC m=+5.551034316,LastTimestamp:2026-03-13 11:46:58.271380829 +0000 UTC m=+5.551034316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.522561 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a8d22ebd5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.476936149 +0000 UTC m=+5.756589636,LastTimestamp:2026-03-13 11:46:58.476936149 +0000 UTC m=+5.756589636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.529409 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a8e095171 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.492035441 +0000 UTC m=+5.771688918,LastTimestamp:2026-03-13 11:46:58.492035441 +0000 UTC m=+5.771688918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.536340 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a8e22b7b0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.493700016 +0000 UTC m=+5.773353503,LastTimestamp:2026-03-13 11:46:58.493700016 +0000 UTC m=+5.773353503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.543590 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a9e95de8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.769682059 +0000 UTC m=+6.049335536,LastTimestamp:2026-03-13 11:46:58.769682059 +0000 UTC m=+6.049335536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.550528 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c641a9fa6166e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:58.787522158 +0000 UTC m=+6.067175635,LastTimestamp:2026-03-13 11:46:58.787522158 +0000 UTC m=+6.067175635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.560290 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:47:48 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c641b6f5af39c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 11:47:48 crc kubenswrapper[4786]: body: Mar 13 11:47:48 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:02.272258972 +0000 UTC m=+9.551912459,LastTimestamp:2026-03-13 11:47:02.272258972 +0000 UTC m=+9.551912459,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:47:48 crc kubenswrapper[4786]: > Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.566990 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c641b6f5c9474 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:02.272365684 +0000 UTC m=+9.552019171,LastTimestamp:2026-03-13 11:47:02.272365684 +0000 UTC m=+9.552019171,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.574046 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 11:47:48 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-apiserver-crc.189c641cc00e7661 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 11:47:48 crc kubenswrapper[4786]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 11:47:48 crc kubenswrapper[4786]: Mar 13 11:47:48 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:07.921167969 +0000 UTC m=+15.200821406,LastTimestamp:2026-03-13 11:47:07.921167969 +0000 UTC m=+15.200821406,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:47:48 crc kubenswrapper[4786]: > Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.582735 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641cc00f0e7d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:07.921206909 +0000 UTC m=+15.200860356,LastTimestamp:2026-03-13 11:47:07.921206909 +0000 UTC m=+15.200860356,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.588045 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c641cc00e7661\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 11:47:48 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-apiserver-crc.189c641cc00e7661 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 11:47:48 crc kubenswrapper[4786]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 11:47:48 crc kubenswrapper[4786]: Mar 13 11:47:48 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:07.921167969 +0000 UTC m=+15.200821406,LastTimestamp:2026-03-13 11:47:07.927407451 +0000 UTC m=+15.207060898,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:47:48 crc kubenswrapper[4786]: > Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.592785 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c641cc00f0e7d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641cc00f0e7d openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:07.921206909 +0000 UTC m=+15.200860356,LastTimestamp:2026-03-13 11:47:07.927451942 +0000 UTC m=+15.207105389,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.600816 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c641a0a3284ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641a0a3284ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.280143103 +0000 UTC m=+3.559796550,LastTimestamp:2026-03-13 11:47:08.544283818 +0000 UTC m=+15.823937265,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.607795 4786 event.go:359] "Server rejected event (will 
not retry!)" err="events \"kube-apiserver-crc.189c641a163d3394\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641a163d3394 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.482169748 +0000 UTC m=+3.761823185,LastTimestamp:2026-03-13 11:47:08.735094554 +0000 UTC m=+16.014748041,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.613600 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c641a16e613d4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c641a16e613d4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:56.493237204 +0000 UTC m=+3.772890651,LastTimestamp:2026-03-13 11:47:08.771740195 +0000 UTC m=+16.051393662,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.622013 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:47:48 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c641dc36d2b91 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 11:47:48 crc kubenswrapper[4786]: body: Mar 13 11:47:48 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:12.272673681 +0000 UTC m=+19.552327128,LastTimestamp:2026-03-13 11:47:12.272673681 +0000 UTC m=+19.552327128,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:47:48 crc kubenswrapper[4786]: > Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.628583 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c641dc36ddbd2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:12.272718802 +0000 UTC m=+19.552372239,LastTimestamp:2026-03-13 11:47:12.272718802 +0000 UTC m=+19.552372239,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.635038 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c641dc36d2b91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:47:48 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c641dc36d2b91 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 11:47:48 crc kubenswrapper[4786]: body: Mar 13 11:47:48 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:12.272673681 +0000 UTC m=+19.552327128,LastTimestamp:2026-03-13 11:47:22.272558339 
+0000 UTC m=+29.552211826,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:47:48 crc kubenswrapper[4786]: > Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.641941 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c641dc36ddbd2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c641dc36ddbd2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:12.272718802 +0000 UTC m=+19.552372239,LastTimestamp:2026-03-13 11:47:22.272649211 +0000 UTC m=+29.552302718,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.648419 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642017ae4d90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:22.27616296 +0000 UTC m=+29.555816487,LastTimestamp:2026-03-13 11:47:22.27616296 +0000 UTC m=+29.555816487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.655163 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6419aec86543\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419aec86543 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:54.746461507 +0000 UTC m=+2.026114954,LastTimestamp:2026-03-13 11:47:22.396780449 +0000 UTC m=+29.676433986,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.661830 4786 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6419c191f9ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419c191f9ae openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.061662126 +0000 UTC m=+2.341315663,LastTimestamp:2026-03-13 11:47:22.612392094 +0000 UTC m=+29.892045541,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.668144 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6419c26ac0ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6419c26ac0ae openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:46:55.075868846 +0000 UTC 
m=+2.355522323,LastTimestamp:2026-03-13 11:47:22.621669372 +0000 UTC m=+29.901322829,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.678267 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c641dc36d2b91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:47:48 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c641dc36d2b91 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 11:47:48 crc kubenswrapper[4786]: body: Mar 13 11:47:48 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:12.272673681 +0000 UTC m=+19.552327128,LastTimestamp:2026-03-13 11:47:32.273616999 +0000 UTC m=+39.553270446,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:47:48 crc kubenswrapper[4786]: > Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.684987 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c641dc36ddbd2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189c641dc36ddbd2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:12.272718802 +0000 UTC m=+19.552372239,LastTimestamp:2026-03-13 11:47:32.27365518 +0000 UTC m=+39.553308627,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:47:48 crc kubenswrapper[4786]: E0313 11:47:48.692764 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c641dc36d2b91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:47:48 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c641dc36d2b91 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 11:47:48 crc kubenswrapper[4786]: body: Mar 13 11:47:48 crc kubenswrapper[4786]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:47:12.272673681 +0000 UTC m=+19.552327128,LastTimestamp:2026-03-13 11:47:42.272307518 +0000 UTC m=+49.551960995,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:47:48 crc kubenswrapper[4786]: > Mar 13 11:47:49 crc kubenswrapper[4786]: I0313 11:47:49.338274 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:49 crc kubenswrapper[4786]: I0313 11:47:49.339601 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:49 crc kubenswrapper[4786]: I0313 11:47:49.339639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:49 crc kubenswrapper[4786]: I0313 11:47:49.339659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:49 crc kubenswrapper[4786]: I0313 11:47:49.339684 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:47:49 crc kubenswrapper[4786]: E0313 11:47:49.344291 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:47:49 crc kubenswrapper[4786]: E0313 11:47:49.344806 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:47:49 crc kubenswrapper[4786]: I0313 11:47:49.374504 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:50 crc kubenswrapper[4786]: I0313 11:47:50.380499 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:51 crc kubenswrapper[4786]: I0313 11:47:51.374849 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.085361 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.085529 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.087032 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.087100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.087128 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.091701 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.385288 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.692538 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.693305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.693357 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:52 crc kubenswrapper[4786]: I0313 11:47:52.693369 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:53 crc kubenswrapper[4786]: I0313 11:47:53.376752 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:53 crc kubenswrapper[4786]: I0313 11:47:53.439727 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:53 crc kubenswrapper[4786]: I0313 11:47:53.440925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:53 crc kubenswrapper[4786]: I0313 11:47:53.440971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:53 crc kubenswrapper[4786]: I0313 11:47:53.440981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:53 crc kubenswrapper[4786]: I0313 11:47:53.441492 4786 scope.go:117] "RemoveContainer" containerID="489aad1d5e6bcb64f532c82235551c0b558999f1c3d8d1cb4651c422dcbbab0a" Mar 13 11:47:53 crc kubenswrapper[4786]: E0313 
11:47:53.608539 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:47:53 crc kubenswrapper[4786]: I0313 11:47:53.699644 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 11:47:53 crc kubenswrapper[4786]: I0313 11:47:53.704429 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69"} Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.382615 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.709595 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.710181 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.712255 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" exitCode=255 Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.712309 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69"} Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.712358 4786 scope.go:117] "RemoveContainer" containerID="489aad1d5e6bcb64f532c82235551c0b558999f1c3d8d1cb4651c422dcbbab0a" Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.712366 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.714013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.714046 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.714058 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:54 crc kubenswrapper[4786]: I0313 11:47:54.714632 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:47:54 crc kubenswrapper[4786]: E0313 11:47:54.714812 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:55 crc kubenswrapper[4786]: I0313 11:47:55.378874 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:55 crc kubenswrapper[4786]: I0313 
11:47:55.716297 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:47:55 crc kubenswrapper[4786]: I0313 11:47:55.718266 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:55 crc kubenswrapper[4786]: I0313 11:47:55.719488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:55 crc kubenswrapper[4786]: I0313 11:47:55.719536 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:55 crc kubenswrapper[4786]: I0313 11:47:55.719553 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:55 crc kubenswrapper[4786]: I0313 11:47:55.720270 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:47:55 crc kubenswrapper[4786]: E0313 11:47:55.720524 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:47:56 crc kubenswrapper[4786]: I0313 11:47:56.344820 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:47:56 crc kubenswrapper[4786]: I0313 11:47:56.346691 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:47:56 crc kubenswrapper[4786]: I0313 11:47:56.346737 4786 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 13 11:47:56 crc kubenswrapper[4786]: I0313 11:47:56.346749 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:47:56 crc kubenswrapper[4786]: I0313 11:47:56.346776 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:47:56 crc kubenswrapper[4786]: E0313 11:47:56.350362 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:47:56 crc kubenswrapper[4786]: E0313 11:47:56.350798 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:47:56 crc kubenswrapper[4786]: I0313 11:47:56.374861 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:57 crc kubenswrapper[4786]: I0313 11:47:57.378058 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:58 crc kubenswrapper[4786]: I0313 11:47:58.377365 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:47:59 crc kubenswrapper[4786]: I0313 11:47:59.378982 4786 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:00 crc kubenswrapper[4786]: I0313 11:48:00.380589 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:00 crc kubenswrapper[4786]: I0313 11:48:00.971766 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:00 crc kubenswrapper[4786]: I0313 11:48:00.971929 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:00 crc kubenswrapper[4786]: I0313 11:48:00.972858 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:00 crc kubenswrapper[4786]: I0313 11:48:00.972906 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:00 crc kubenswrapper[4786]: I0313 11:48:00.972916 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:00 crc kubenswrapper[4786]: I0313 11:48:00.973419 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:48:00 crc kubenswrapper[4786]: E0313 11:48:00.973570 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:01 
crc kubenswrapper[4786]: I0313 11:48:01.380200 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:02 crc kubenswrapper[4786]: I0313 11:48:02.378394 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:03 crc kubenswrapper[4786]: I0313 11:48:03.350833 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:03 crc kubenswrapper[4786]: I0313 11:48:03.357787 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:03 crc kubenswrapper[4786]: I0313 11:48:03.357871 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:03 crc kubenswrapper[4786]: I0313 11:48:03.357939 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:03 crc kubenswrapper[4786]: I0313 11:48:03.357995 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:03 crc kubenswrapper[4786]: E0313 11:48:03.359797 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:48:03 crc kubenswrapper[4786]: E0313 11:48:03.360081 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="7s" Mar 13 11:48:03 crc kubenswrapper[4786]: I0313 11:48:03.378166 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:03 crc kubenswrapper[4786]: E0313 11:48:03.609773 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:48:04 crc kubenswrapper[4786]: I0313 11:48:04.378186 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:04 crc kubenswrapper[4786]: I0313 11:48:04.504961 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:04 crc kubenswrapper[4786]: I0313 11:48:04.505199 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:04 crc kubenswrapper[4786]: I0313 11:48:04.506677 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:04 crc kubenswrapper[4786]: I0313 11:48:04.506727 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:04 crc kubenswrapper[4786]: I0313 11:48:04.506746 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:04 crc kubenswrapper[4786]: I0313 11:48:04.507531 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:48:04 crc kubenswrapper[4786]: E0313 11:48:04.507828 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:05 crc kubenswrapper[4786]: I0313 11:48:05.380482 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:05 crc kubenswrapper[4786]: I0313 11:48:05.695066 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:48:05 crc kubenswrapper[4786]: I0313 11:48:05.715202 4786 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 11:48:06 crc kubenswrapper[4786]: I0313 11:48:06.378097 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:07 crc kubenswrapper[4786]: I0313 11:48:07.381720 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:08 crc kubenswrapper[4786]: I0313 11:48:08.378726 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:08 crc kubenswrapper[4786]: I0313 11:48:08.866027 4786 csr.go:261] certificate 
signing request csr-s7jrh is approved, waiting to be issued Mar 13 11:48:08 crc kubenswrapper[4786]: I0313 11:48:08.878152 4786 csr.go:257] certificate signing request csr-s7jrh is issued Mar 13 11:48:08 crc kubenswrapper[4786]: I0313 11:48:08.896590 4786 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 13 11:48:09 crc kubenswrapper[4786]: I0313 11:48:09.213070 4786 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 13 11:48:09 crc kubenswrapper[4786]: I0313 11:48:09.879536 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 20:27:14.276933353 +0000 UTC Mar 13 11:48:09 crc kubenswrapper[4786]: I0313 11:48:09.879607 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6488h39m4.397331332s for next certificate rotation Mar 13 11:48:09 crc kubenswrapper[4786]: I0313 11:48:09.935263 4786 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.359956 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.361417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.361477 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.361497 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.361684 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 
11:48:10.378415 4786 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.378720 4786 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.378754 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.382147 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.382200 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.382219 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.382660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.382727 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:10Z","lastTransitionTime":"2026-03-13T11:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.402948 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.412907 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.413000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.413022 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.413047 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.413073 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:10Z","lastTransitionTime":"2026-03-13T11:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.430717 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.435295 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.435348 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.435365 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.435389 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.435407 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:10Z","lastTransitionTime":"2026-03-13T11:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.449859 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.454093 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.454154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.454177 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.454204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:10 crc kubenswrapper[4786]: I0313 11:48:10.454225 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:10Z","lastTransitionTime":"2026-03-13T11:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.469968 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.470427 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.470474 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.570875 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.671024 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.772001 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.872373 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:10 crc kubenswrapper[4786]: E0313 11:48:10.972703 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.073070 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.173876 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.275107 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.375592 4786 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.476317 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.576720 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.677017 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.777791 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.878956 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:11 crc kubenswrapper[4786]: E0313 11:48:11.979985 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.080727 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.181707 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.282000 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.383012 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.483321 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc 
kubenswrapper[4786]: E0313 11:48:12.584542 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.685673 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.786764 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.887946 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:12 crc kubenswrapper[4786]: E0313 11:48:12.988710 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.089630 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.190199 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.290706 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.391026 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.491970 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.592519 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.610752 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.693094 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.793564 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.894784 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:13 crc kubenswrapper[4786]: E0313 11:48:13.994964 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.096052 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.196164 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.296590 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.396719 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.497397 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.597995 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.698562 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.799147 4786 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:14 crc kubenswrapper[4786]: E0313 11:48:14.900097 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.001042 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.102142 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.203354 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.304227 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.405087 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.505962 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.607071 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.707974 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.808804 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:15 crc kubenswrapper[4786]: E0313 11:48:15.909425 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc 
kubenswrapper[4786]: E0313 11:48:16.010614 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.111691 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.212875 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.313567 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.413696 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.514322 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.615308 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.716380 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.817043 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:16 crc kubenswrapper[4786]: E0313 11:48:16.917904 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.018661 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.119652 4786 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.220855 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.321013 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.421181 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: I0313 11:48:17.439938 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:17 crc kubenswrapper[4786]: I0313 11:48:17.441439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:17 crc kubenswrapper[4786]: I0313 11:48:17.441482 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:17 crc kubenswrapper[4786]: I0313 11:48:17.441502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:17 crc kubenswrapper[4786]: I0313 11:48:17.442556 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.442857 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.521585 4786 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.622337 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.723285 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.824215 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:17 crc kubenswrapper[4786]: E0313 11:48:17.924824 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.025953 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.126151 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.227103 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.328019 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.429219 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.529835 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.630270 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.731331 4786 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.833105 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:18 crc kubenswrapper[4786]: E0313 11:48:18.933473 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.033809 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.078274 4786 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.137509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.137561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.137573 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.137597 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.137612 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.240968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.241033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.241054 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.241081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.241106 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.345281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.345352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.345370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.345397 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.345416 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.408754 4786 apiserver.go:52] "Watching apiserver" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.414486 4786 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.414838 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.415306 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.415478 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.415660 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.415709 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.416191 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.416215 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.416472 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.416628 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.416725 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.419452 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.420108 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.420603 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.420847 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.421076 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.421291 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.424263 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.424652 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.425145 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.448221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc 
kubenswrapper[4786]: I0313 11:48:19.448286 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.448310 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.448342 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.448365 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.466307 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.478399 4786 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.484407 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.500151 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.528056 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.546013 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.551262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.551553 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.551702 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.551819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.551952 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.569595 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.572710 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.573215 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.573512 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:48:19 crc kubenswrapper[4786]: 
I0313 11:48:19.573924 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.574084 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.574600 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.574749 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.573153 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.575002 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.573735 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.573748 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.574301 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.574538 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.575251 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.575851 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576279 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576423 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576582 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576691 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576797 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576224 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576986 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576911 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.576511 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577039 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577074 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 
11:48:19.576847 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577174 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577121 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577280 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577302 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577410 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577435 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577459 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577508 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577533 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577579 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577611 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577639 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577680 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577711 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577726 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577739 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577756 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577776 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577817 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577851 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577877 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577942 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577979 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578016 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578055 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578089 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578122 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578156 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578234 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578271 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578307 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578342 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578409 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578448 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578484 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578522 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578559 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578608 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.577910 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578645 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578003 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578302 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578339 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578358 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578408 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578600 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578635 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578625 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578838 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578913 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.579081 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.579118 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.579443 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.579456 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.579521 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.579746 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.579782 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.579849 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580019 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580070 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580152 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580628 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580705 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.578686 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580795 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580837 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580907 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580947 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580988 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581028 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581122 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581164 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581203 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581238 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: 
I0313 11:48:19.581272 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581307 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581341 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581377 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581413 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581451 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581487 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.580794 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581522 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581559 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581593 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581626 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581662 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581695 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581735 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581771 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581804 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581839 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581874 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581936 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.581969 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582005 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582040 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582074 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582112 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582143 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582179 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582213 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582247 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582281 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582319 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582353 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582388 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:48:19 crc 
kubenswrapper[4786]: I0313 11:48:19.582422 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582455 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582489 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582524 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582558 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582590 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582624 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582657 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582692 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582727 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582761 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 11:48:19 crc 
kubenswrapper[4786]: I0313 11:48:19.582796 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582830 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582868 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582935 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.582969 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583005 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583041 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583075 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583111 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583146 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583181 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:48:19 
crc kubenswrapper[4786]: I0313 11:48:19.583216 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583252 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583288 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583324 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583359 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583395 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583429 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583468 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583501 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583534 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583570 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583607 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583643 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583706 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583744 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583780 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583816 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583854 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583913 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583948 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.583984 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584018 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584054 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584099 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584134 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584167 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584202 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584236 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584271 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584306 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584340 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584412 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584447 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584486 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584524 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584560 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584597 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584633 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584671 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584707 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584758 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584796 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584834 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584872 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585192 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585239 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585276 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585316 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585377 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585414 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585452 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585490 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585530 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585568 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585608 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585646 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585685 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585816 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585858 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585942 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585981 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586020 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586056 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586094 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586132 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586170 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586206 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586246 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586325 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586367 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 11:48:19 crc 
kubenswrapper[4786]: I0313 11:48:19.586404 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586441 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586480 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586518 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586584 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586629 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586669 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586709 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586747 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586848 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586911 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586954 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587001 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587043 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587090 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587135 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587172 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587288 4786 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587314 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587338 4786 reconciler_common.go:293] 
"Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587361 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587384 4786 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587408 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587431 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587452 4786 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587474 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587496 4786 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587519 4786 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587540 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587562 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587584 4786 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587607 4786 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587628 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587670 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587691 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587714 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587738 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587760 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587783 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587805 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587827 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587849 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587871 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587931 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587956 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587979 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588001 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588022 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 
11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588046 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588067 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588088 4786 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588109 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588131 4786 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588151 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588172 4786 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588192 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588214 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588237 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588262 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588283 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584464 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584513 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584665 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.584914 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585025 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585331 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585346 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585649 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.585918 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586075 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586244 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586909 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586874 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588641 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.586962 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587206 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587377 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587681 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587754 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.587464 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588027 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588295 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588330 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588403 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.588448 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.589023 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.589200 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.589273 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.589419 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.589586 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.590091 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.590329 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.590390 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:20.090369054 +0000 UTC m=+87.370022571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.590604 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.590761 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.590663 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.590814 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.590987 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.591020 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.591454 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.591597 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.592214 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.592287 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:20.092267953 +0000 UTC m=+87.371921490 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.592333 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.592371 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.592651 4786 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.593425 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.597962 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.598198 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.598381 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.598729 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.598812 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.598957 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.599111 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.599138 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.598563 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.599316 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.599557 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.599734 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.599778 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.599890 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.600198 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.600249 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:48:20.100224566 +0000 UTC m=+87.379878023 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.600270 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.600289 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.600492 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.601181 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.601688 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.603099 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.603628 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.607055 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.608676 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.608828 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.608945 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.609083 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:20.109063942 +0000 UTC m=+87.388717399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.609104 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.609692 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.610028 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.610135 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.610387 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.610507 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.610908 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.611321 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.611679 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.612236 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.612562 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.598420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.615247 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.615698 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.615760 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.619533 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.619557 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.619570 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:19 crc kubenswrapper[4786]: E0313 11:48:19.619789 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:20.119770126 +0000 UTC m=+87.399423673 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.620075 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.620122 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.620431 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.620714 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.620797 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.620749 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.621073 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.621282 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.621325 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.621368 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.621399 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.622984 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.623247 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.623531 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.623843 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624123 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624200 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624270 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624497 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624925 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624935 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624938 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624967 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.624963 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.625305 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.625415 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.625684 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.626020 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.626089 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.626745 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.627272 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.627696 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.627825 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.627994 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.626153 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.630568 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.631371 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.631391 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.631655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.631850 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.632067 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.632271 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.632318 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.632168 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.632410 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.632732 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.632789 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.634403 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.634453 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.634871 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.634960 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.635025 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.635103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.635562 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.635697 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.635766 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.636235 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.636579 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.637110 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638104 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638101 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638218 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638414 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638828 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638443 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638542 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638602 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638643 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638667 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.638731 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.639463 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.639778 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.639869 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.639934 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.640191 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.642434 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.650637 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.655669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.655717 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.655731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.655748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.655761 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.656754 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.658128 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.665273 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689555 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689694 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689710 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689725 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689740 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689753 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689765 4786 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689765 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689830 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689777 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689929 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689962 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.689986 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690010 4786 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690036 4786 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690059 4786 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") 
on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690083 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690105 4786 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690131 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690155 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690178 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690197 4786 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690220 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 
11:48:19.690244 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690270 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690293 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690316 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690340 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690364 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690388 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690413 4786 reconciler_common.go:293] "Volume 
detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690435 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690461 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690484 4786 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690512 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690539 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690564 4786 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690587 4786 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690610 4786 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690633 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690656 4786 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690683 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690706 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690728 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690751 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 
11:48:19.690777 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690801 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690825 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690848 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690872 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690938 4786 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690962 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.690985 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691010 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691033 4786 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691057 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691075 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691091 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691108 4786 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691126 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691142 4786 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691159 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691176 4786 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691193 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691213 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691231 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691247 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691263 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691280 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691296 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691312 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691329 4786 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691345 4786 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691362 4786 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node 
\"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691379 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691396 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691412 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691429 4786 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691446 4786 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691463 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691482 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691500 4786 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691516 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691533 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691554 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691570 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691618 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691638 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691655 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691673 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691691 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691708 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691726 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691742 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691761 4786 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691778 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") 
on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691795 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691811 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691827 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691844 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691862 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691904 4786 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691923 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691940 4786 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691957 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691973 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.691990 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692007 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692025 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692042 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692059 4786 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692076 4786 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692092 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692111 4786 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692128 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692145 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692163 4786 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692180 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc 
kubenswrapper[4786]: I0313 11:48:19.692198 4786 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692222 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692239 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692256 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692273 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692290 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692307 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692325 4786 reconciler_common.go:293] "Volume 
detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692343 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692360 4786 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692377 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692393 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692411 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692428 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692445 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692461 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692478 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692495 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692512 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692528 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692545 4786 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692564 4786 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath 
\"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692581 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692598 4786 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692662 4786 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692681 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692699 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692716 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692734 4786 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 
11:48:19.692752 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692771 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692790 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692807 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692825 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692842 4786 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.692858 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.753062 4786 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.758536 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.758613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.758638 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.758665 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.758682 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.763096 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.772616 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.782014 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a93c07e45d7db1b895ca6328ae8cdda3d054f517b0bd3b748d22a0efe7725fef"} Mar 13 11:48:19 crc kubenswrapper[4786]: W0313 11:48:19.784032 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-031121a9aef3b81d71916f7aed0bf34a744e285115ab42a52ff1a2f06224bd06 WatchSource:0}: Error finding container 031121a9aef3b81d71916f7aed0bf34a744e285115ab42a52ff1a2f06224bd06: Status 404 returned error can't find the container with id 031121a9aef3b81d71916f7aed0bf34a744e285115ab42a52ff1a2f06224bd06 Mar 13 11:48:19 crc kubenswrapper[4786]: W0313 11:48:19.792169 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-7b2c1db83fd144344b73972012fe4fd8dcf439642e90a0a240b355a182f175fd WatchSource:0}: Error finding container 7b2c1db83fd144344b73972012fe4fd8dcf439642e90a0a240b355a182f175fd: Status 404 returned error can't find the container with id 7b2c1db83fd144344b73972012fe4fd8dcf439642e90a0a240b355a182f175fd Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.861312 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.861348 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.861380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.861425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.861437 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.963460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.963495 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.963504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.963518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:19 crc kubenswrapper[4786]: I0313 11:48:19.963527 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:19Z","lastTransitionTime":"2026-03-13T11:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.066494 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.066533 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.066550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.066573 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.066588 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.096518 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.096721 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.096820 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.096985 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:21.096852292 +0000 UTC m=+88.376505739 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.097445 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.097530 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:21.097504648 +0000 UTC m=+88.377158135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.169577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.169787 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.169931 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.170030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc 
kubenswrapper[4786]: I0313 11:48:20.170334 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.197871 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.198035 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:48:21.198006308 +0000 UTC m=+88.477659765 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.198633 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.199085 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.199025 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.199608 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.199763 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.200076 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:21.20004587 +0000 UTC m=+88.479699347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.199417 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.200926 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.201108 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.201308 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:21.201287561 +0000 UTC m=+88.480941038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.273607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.273872 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.274050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.274199 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.274342 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.376599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.376656 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.376675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.376698 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.376717 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.479670 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.479981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.480059 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.480142 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.480207 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.582769 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.582839 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.582856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.582918 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.582937 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.595056 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.595136 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.595154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.595179 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.595197 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.612368 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.617496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.617557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.617574 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.617602 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.617637 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.635136 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.644037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.644065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.644073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.644085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.644095 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.661854 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.667659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.667723 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.667741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.667765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.667788 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.684633 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.689168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.689241 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.689255 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.689284 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.689302 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.703363 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: E0313 11:48:20.703586 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.705567 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.705616 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.705646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.705667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.705687 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.786515 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"031121a9aef3b81d71916f7aed0bf34a744e285115ab42a52ff1a2f06224bd06"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.788710 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.790807 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.790838 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.790850 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7b2c1db83fd144344b73972012fe4fd8dcf439642e90a0a240b355a182f175fd"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.802939 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.808027 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.808082 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.808099 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.808122 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.808139 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.821281 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.835975 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.851866 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.871183 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.883528 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.899969 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.910356 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.910423 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.910436 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 
11:48:20.910465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.910478 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:20Z","lastTransitionTime":"2026-03-13T11:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.916537 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.929122 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.940929 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.959581 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:20 crc kubenswrapper[4786]: I0313 11:48:20.977839 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.013582 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.013644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.013667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.013697 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.013720 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.107180 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.107257 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.107364 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.107473 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:23.107448506 +0000 UTC m=+90.387102043 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.107371 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.107563 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:23.107533538 +0000 UTC m=+90.387187005 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.116785 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.117528 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.117561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.117598 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.117625 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.207666 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.207811 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.207857 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.207950 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:48:23.207915185 +0000 UTC m=+90.487568662 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.208098 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.208127 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.208181 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.208201 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.208141 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:21 crc kubenswrapper[4786]: 
E0313 11:48:21.208314 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.208279 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:23.208254663 +0000 UTC m=+90.487908150 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.208398 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:23.208370166 +0000 UTC m=+90.488023653 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.220695 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.220755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.220768 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.220791 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.220806 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.323132 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.323207 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.323231 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.323263 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.323285 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.426121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.426202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.426230 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.426262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.426287 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.439989 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.440130 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.439995 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.440232 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.440241 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:21 crc kubenswrapper[4786]: E0313 11:48:21.440417 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.445208 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.446368 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.448735 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.450053 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.452190 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.453494 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.454790 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.458152 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.461584 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.463301 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.464402 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.466593 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.467604 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.469541 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.473583 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.475346 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.476720 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.478591 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.479828 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.482072 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.483411 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.485096 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.486844 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.488336 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.490363 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.492541 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.494591 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.495338 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.496733 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.497445 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.498090 4786 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.498225 4786 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.501110 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.502102 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.503537 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.505808 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.506877 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.508638 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.510193 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.514162 4786 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.515755 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.517364 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.519763 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.522014 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.524233 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.526316 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.527490 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.530048 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.530099 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.530116 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.530141 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.530159 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.530496 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.531730 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.533583 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.534728 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.535963 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.538171 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.540483 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.633433 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc 
kubenswrapper[4786]: I0313 11:48:21.633483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.633501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.633524 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.633541 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.736255 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.736293 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.736304 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.736321 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.736334 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.839140 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.839205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.839219 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.839359 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.839377 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.941838 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.941870 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.941899 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.941917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:21 crc kubenswrapper[4786]: I0313 11:48:21.941930 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:21Z","lastTransitionTime":"2026-03-13T11:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.044672 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.044762 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.044799 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.044820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.044836 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.147621 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.147679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.147690 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.147705 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.147738 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.250918 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.250961 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.251003 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.251021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.251032 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.354372 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.354420 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.354436 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.354459 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.354497 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.457259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.457305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.457313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.457328 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.457337 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.559759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.559801 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.559812 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.559828 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.559838 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.663035 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.663085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.663100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.663122 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.663137 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.766401 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.766456 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.766472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.766496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.766512 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.868764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.868810 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.868820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.868832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.868841 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.972037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.972092 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.972102 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.972126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:22 crc kubenswrapper[4786]: I0313 11:48:22.972139 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:22Z","lastTransitionTime":"2026-03-13T11:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.075650 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.075724 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.075738 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.075759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.075772 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.131702 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.131746 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.131813 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.131862 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:27.131849717 +0000 UTC m=+94.411503164 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.132051 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.132217 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:27.132181185 +0000 UTC m=+94.411834672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.178673 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.178714 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.178724 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.178758 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.178770 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.232805 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.233054 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:48:27.233025993 +0000 UTC m=+94.512679490 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.233210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.233446 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.233467 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.233468 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.233476 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:23 crc 
kubenswrapper[4786]: E0313 11:48:23.233493 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.233511 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.233517 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:27.233507316 +0000 UTC m=+94.513160763 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.233602 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:27.233586868 +0000 UTC m=+94.513240345 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.233666 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.282552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.282649 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.282669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.282737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.282755 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.386579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.386636 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.386651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.386672 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.386685 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.439698 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.439791 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.439839 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.439936 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.440087 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:23 crc kubenswrapper[4786]: E0313 11:48:23.440438 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.454058 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.467697 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.480853 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.489922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.489969 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 
11:48:23.489981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.490000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.490013 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.497956 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.512941 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.525649 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.592277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.592341 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.592353 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.592372 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.592384 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.695607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.695658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.695675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.695700 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.695721 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.798607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.798664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.798685 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.798708 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.798725 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.801024 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.838340 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\
":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.855084 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 
11:48:23.873083 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.889057 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.903007 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.903068 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.903082 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.903106 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.903120 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:23Z","lastTransitionTime":"2026-03-13T11:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.903306 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:23 crc kubenswrapper[4786]: I0313 11:48:23.927652 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.005613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.005657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc 
kubenswrapper[4786]: I0313 11:48:24.005670 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.005686 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.005698 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.108078 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.108121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.108130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.108147 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.108156 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.211539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.211609 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.211633 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.211664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.211686 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.313835 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.313896 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.313912 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.313930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.313942 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.416617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.416682 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.416700 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.416725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.416746 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.519944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.519997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.520013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.520037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.520053 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.623445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.623537 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.623555 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.623591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.623609 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.726416 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.726484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.726502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.726525 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.726542 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.829647 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.829709 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.829726 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.829752 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.829769 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.931839 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.931903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.931917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.931932 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:24 crc kubenswrapper[4786]: I0313 11:48:24.931943 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:24Z","lastTransitionTime":"2026-03-13T11:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.034806 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.034877 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.034932 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.034963 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.034984 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.138086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.138133 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.138144 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.138163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.138179 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.240931 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.240979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.240997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.241022 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.241042 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.343434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.343472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.343500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.343519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.343534 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.440080 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.440133 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.440150 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:25 crc kubenswrapper[4786]: E0313 11:48:25.440220 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:25 crc kubenswrapper[4786]: E0313 11:48:25.440321 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:25 crc kubenswrapper[4786]: E0313 11:48:25.440399 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.446127 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.446153 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.446161 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.446175 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.446184 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.455816 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.549122 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.549150 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.549158 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.549171 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.549181 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.652277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.652329 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.652347 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.652372 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.652388 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.755653 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.755696 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.755714 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.755737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.755756 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.858702 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.858757 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.858773 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.858796 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.858814 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.961184 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.961614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.961632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.961650 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:25 crc kubenswrapper[4786]: I0313 11:48:25.961659 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:25Z","lastTransitionTime":"2026-03-13T11:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.064405 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.064782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.064984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.065133 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.065278 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:26Z","lastTransitionTime":"2026-03-13T11:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.078021 4786 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.168830 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.168914 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.168932 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.168960 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.168977 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:26Z","lastTransitionTime":"2026-03-13T11:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.272330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.272364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.272375 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.272391 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.272400 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:26Z","lastTransitionTime":"2026-03-13T11:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.507185 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.507232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.507247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.507267 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.507283 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:26Z","lastTransitionTime":"2026-03-13T11:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.609663 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.609711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.609722 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.609740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.609752 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:26Z","lastTransitionTime":"2026-03-13T11:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.712441 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.712484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.712495 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.712514 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.712527 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:26Z","lastTransitionTime":"2026-03-13T11:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.814723 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.814797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.814837 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.814864 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.814916 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:26Z","lastTransitionTime":"2026-03-13T11:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.917438 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.917489 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.917502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.917518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:26 crc kubenswrapper[4786]: I0313 11:48:26.917530 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:26Z","lastTransitionTime":"2026-03-13T11:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.020135 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.020201 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.020219 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.020243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.020261 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.123481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.123539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.123557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.123580 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.123598 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.168742 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.168869 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.168984 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.169014 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.169068 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:35.169045983 +0000 UTC m=+102.448699480 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.169131 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:35.169097745 +0000 UTC m=+102.448751232 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.226154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.226204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.226246 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.226268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.226280 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.269318 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.269417 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.269473 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:48:35.269451429 +0000 UTC m=+102.549104886 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.269503 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.269570 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.269599 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.269606 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.269619 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:27 crc kubenswrapper[4786]: 
E0313 11:48:27.269619 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.269636 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.269682 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:35.269667215 +0000 UTC m=+102.549320682 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.269707 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:35.269696135 +0000 UTC m=+102.549349592 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.329856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.329960 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.329980 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.330008 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.330027 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.432836 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.432945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.432968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.432997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.433021 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.440170 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.440186 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.440375 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.440426 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.440559 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:27 crc kubenswrapper[4786]: E0313 11:48:27.440657 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.535856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.535972 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.535999 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.536030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.536053 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.638215 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.638253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.638263 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.638276 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.638286 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.740014 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.740049 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.740059 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.740073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.740083 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.842324 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.842361 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.842370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.842384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.842394 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.945245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.945291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.945349 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.945371 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:27 crc kubenswrapper[4786]: I0313 11:48:27.945385 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:27Z","lastTransitionTime":"2026-03-13T11:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.048262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.048308 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.048320 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.048340 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.048354 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.150231 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.150289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.150308 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.150333 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.150351 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.252849 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.252950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.252976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.253005 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.253027 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.355394 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.355455 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.355472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.355496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.355517 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.451198 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.452041 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:48:28 crc kubenswrapper[4786]: E0313 11:48:28.452380 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.458120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.458186 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.458204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.458226 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.458242 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.561023 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.561088 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.561107 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.561130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.561147 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.663471 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.663510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.663522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.663536 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.663544 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.765660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.765723 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.765740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.765763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.765780 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.815556 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:48:28 crc kubenswrapper[4786]: E0313 11:48:28.815736 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.869441 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.869498 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.869515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.869539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.869558 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.906972 4786 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.972751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.972807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.972824 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.972852 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:28 crc kubenswrapper[4786]: I0313 11:48:28.972869 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:28Z","lastTransitionTime":"2026-03-13T11:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.077371 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.077485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.077505 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.077534 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.077550 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.180467 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.180504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.180513 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.180529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.180537 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.283572 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.283651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.283676 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.283703 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.283722 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.385930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.385976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.385987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.386007 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.386021 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.440546 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.440575 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:29 crc kubenswrapper[4786]: E0313 11:48:29.440681 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.440700 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:29 crc kubenswrapper[4786]: E0313 11:48:29.440767 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:29 crc kubenswrapper[4786]: E0313 11:48:29.441009 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.488418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.488471 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.488488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.488516 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.488533 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.591346 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.591425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.591453 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.591486 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.591511 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.695096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.695144 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.695154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.695174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.695187 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.798360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.798427 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.798440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.798462 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.798474 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.900922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.901284 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.901484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.901689 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:29 crc kubenswrapper[4786]: I0313 11:48:29.901862 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:29Z","lastTransitionTime":"2026-03-13T11:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.006242 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.006321 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.006342 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.006406 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.006427 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.109194 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.109330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.109351 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.109376 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.109412 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.212609 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.213125 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.213289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.213460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.213662 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.316675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.316735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.316754 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.316778 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.316795 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.419657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.419740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.419763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.419792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.419814 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.458029 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.522485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.522544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.522561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.522588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.522605 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.625077 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.625119 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.625128 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.625144 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.625154 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.727189 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.727254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.727278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.727307 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.727327 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.830126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.830196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.830214 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.830238 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.830265 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.932923 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.932958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.932968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.932984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:30 crc kubenswrapper[4786]: I0313 11:48:30.932995 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:30Z","lastTransitionTime":"2026-03-13T11:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.034921 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.034947 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.034955 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.034967 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.034976 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.067642 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.067693 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.067710 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.067731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.067746 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: E0313 11:48:31.081203 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.084655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.084679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.084690 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.084705 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.084717 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: E0313 11:48:31.096869 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.100014 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.100038 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.100046 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.100058 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.100067 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: E0313 11:48:31.144753 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:31 crc kubenswrapper[4786]: E0313 11:48:31.144912 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.146163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.146181 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.146188 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.146200 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.146209 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.248587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.248622 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.248632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.248648 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.248660 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.351278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.351313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.351327 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.351343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.351357 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.440653 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:31 crc kubenswrapper[4786]: E0313 11:48:31.440947 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.441365 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:31 crc kubenswrapper[4786]: E0313 11:48:31.441520 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.441625 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:31 crc kubenswrapper[4786]: E0313 11:48:31.441733 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.453928 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.454248 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.454468 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.454818 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.455046 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.557729 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.557792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.557811 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.557837 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.557907 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.660488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.660546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.660563 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.660592 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.660610 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.765184 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.765621 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.765850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.766088 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.766311 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.869509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.869813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.869975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.870096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.870232 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.973810 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.973940 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.973963 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.974002 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:31 crc kubenswrapper[4786]: I0313 11:48:31.974044 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:31Z","lastTransitionTime":"2026-03-13T11:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.077438 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.077784 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.078077 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.078249 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.078409 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.181355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.181494 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.181528 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.181561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.181588 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.284175 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.284586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.284747 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.284917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.285105 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.388391 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.388851 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.389117 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.389337 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.389506 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.492556 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.492803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.492819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.492843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.492860 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.595730 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.595855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.595911 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.595937 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.595954 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.699417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.699479 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.699507 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.699532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.699549 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.802629 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.802693 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.802710 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.802731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.802747 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.906067 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.906531 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.906847 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.907111 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:32 crc kubenswrapper[4786]: I0313 11:48:32.907165 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:32Z","lastTransitionTime":"2026-03-13T11:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.009737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.009800 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.009817 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.009842 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.009860 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.113605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.113954 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.114100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.114280 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.114436 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.219535 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.219622 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.219646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.219674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.219702 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.322765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.322827 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.322845 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.322870 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.322914 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.426023 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.426076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.426093 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.426116 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.426133 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.439858 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.439906 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.440104 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:33 crc kubenswrapper[4786]: E0313 11:48:33.441187 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:33 crc kubenswrapper[4786]: E0313 11:48:33.441750 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:33 crc kubenswrapper[4786]: E0313 11:48:33.442053 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.461473 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.481818 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.502441 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.525666 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.528872 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.528946 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.528962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.528989 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: 
I0313 11:48:33.529006 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.543761 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.572873 4786 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3
b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538
115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.589275 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.608692 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.623839 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.631776 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.631843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.631861 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.631914 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.631935 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.734700 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.734759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.734775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.734799 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.734817 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.837675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.838231 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.838388 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.838526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.838663 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.945643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.946126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.946311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.946459 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:33 crc kubenswrapper[4786]: I0313 11:48:33.946588 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:33Z","lastTransitionTime":"2026-03-13T11:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.049275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.049315 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.049335 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.049360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.049378 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.152416 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.152482 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.152507 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.152539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.152564 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.256827 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.257143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.257370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.257625 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.257922 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.361015 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.361082 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.361099 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.361126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.361143 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.464467 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.464541 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.464566 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.464593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.464616 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.568290 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.568346 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.568362 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.568387 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.568405 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.673086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.673159 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.673199 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.673238 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.673268 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.776320 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.776404 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.776425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.776448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.776466 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.879183 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.879241 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.879258 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.879280 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.879297 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.982587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.982656 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.982679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.982709 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:34 crc kubenswrapper[4786]: I0313 11:48:34.982731 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:34Z","lastTransitionTime":"2026-03-13T11:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.087330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.087916 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.088109 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.088345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.088494 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.190196 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.190327 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.190439 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.190506 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.190551 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:51.190519602 +0000 UTC m=+118.470173089 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.190599 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:51.190565733 +0000 UTC m=+118.470219220 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.192958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.193202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.193362 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.193581 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.194028 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.291077 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.291252 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:48:51.291214196 +0000 UTC m=+118.570867693 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.291724 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.291767 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.291987 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.291996 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.292013 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.292032 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.292039 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.292049 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.292099 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:51.292081968 +0000 UTC m=+118.571735445 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.292122 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:51.292110799 +0000 UTC m=+118.571764286 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.297112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.297170 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.297193 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.297221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.297242 4786 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.400656 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.401086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.401362 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.401525 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.401661 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.439740 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.440095 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.440278 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.440332 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.440749 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:35 crc kubenswrapper[4786]: E0313 11:48:35.440812 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.504621 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.504945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.505109 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.505265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.505404 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.607976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.608711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.608938 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.609114 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.609277 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.712811 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.712867 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.712919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.712943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.712958 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.815752 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.815813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.815836 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.815867 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.815927 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.918843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.918948 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.918966 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.918991 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:35 crc kubenswrapper[4786]: I0313 11:48:35.919008 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:35Z","lastTransitionTime":"2026-03-13T11:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.022557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.022626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.022651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.022680 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.022702 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.126607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.126670 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.126687 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.126713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.126731 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.230383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.230450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.230474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.230507 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.230528 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.332982 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.333021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.333033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.333047 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.333058 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.436172 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.436233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.436268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.436291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.436307 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.538577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.538834 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.538922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.539022 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.539083 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.641856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.641941 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.641962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.642168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.642188 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.744595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.744632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.744643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.744660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.744672 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.847179 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.847232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.847249 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.847272 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.847288 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.949216 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.949445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.949527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.949611 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:36 crc kubenswrapper[4786]: I0313 11:48:36.949692 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:36Z","lastTransitionTime":"2026-03-13T11:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.052983 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.053335 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.053476 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.053618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.053756 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.156429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.156910 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.157139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.157336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.157514 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.260119 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.260510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.260735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.260946 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.261093 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.363431 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.363617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.363798 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.363867 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.363961 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.441072 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:37 crc kubenswrapper[4786]: E0313 11:48:37.441528 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.442031 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:37 crc kubenswrapper[4786]: E0313 11:48:37.442256 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.447007 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:37 crc kubenswrapper[4786]: E0313 11:48:37.447235 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.455694 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rln62"] Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.456550 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rln62" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.458548 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.459139 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.460065 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.466850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.467075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.467214 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.467344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.467466 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.472748 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.489135 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.505690 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.525719 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.542631 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.558171 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.569802 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.569859 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.569876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.569945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.569963 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.573598 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.588339 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 
2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.609551 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/li
b/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46
:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.614739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5-hosts-file\") pod \"node-resolver-rln62\" (UID: \"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\") " pod="openshift-dns/node-resolver-rln62" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.614791 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6h48\" (UniqueName: \"kubernetes.io/projected/c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5-kube-api-access-q6h48\") pod \"node-resolver-rln62\" (UID: \"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\") " pod="openshift-dns/node-resolver-rln62" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.624454 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.671980 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.672033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.672049 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.672073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.672089 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.715722 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5-hosts-file\") pod \"node-resolver-rln62\" (UID: \"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\") " pod="openshift-dns/node-resolver-rln62" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.715779 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6h48\" (UniqueName: \"kubernetes.io/projected/c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5-kube-api-access-q6h48\") pod \"node-resolver-rln62\" (UID: \"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\") " pod="openshift-dns/node-resolver-rln62" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.716184 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5-hosts-file\") pod \"node-resolver-rln62\" (UID: \"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\") " pod="openshift-dns/node-resolver-rln62" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.742248 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6h48\" (UniqueName: \"kubernetes.io/projected/c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5-kube-api-access-q6h48\") pod \"node-resolver-rln62\" (UID: \"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\") " pod="openshift-dns/node-resolver-rln62" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.774532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.774582 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.774594 4786 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.774610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.774623 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.777848 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rln62" Mar 13 11:48:37 crc kubenswrapper[4786]: W0313 11:48:37.794628 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50d3c3e_8ce8_4be6_9bf5_6c486c3a4df5.slice/crio-c07ece8ccdcdd9db6cadea9ce3db9257df9b0481233729de58c813e90035762b WatchSource:0}: Error finding container c07ece8ccdcdd9db6cadea9ce3db9257df9b0481233729de58c813e90035762b: Status 404 returned error can't find the container with id c07ece8ccdcdd9db6cadea9ce3db9257df9b0481233729de58c813e90035762b Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.824292 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-8ncs8"] Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.825012 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6g54w"] Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.825940 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.826592 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.830201 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-b5xwr"] Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.830691 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.830784 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b5xwr" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.833050 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.833265 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.833388 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.833693 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.833793 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.834195 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.834251 4786 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.833318 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.834473 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.834204 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.834681 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.846642 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rln62" event={"ID":"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5","Type":"ContainerStarted","Data":"c07ece8ccdcdd9db6cadea9ce3db9257df9b0481233729de58c813e90035762b"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.849204 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf
e80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.861588 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.875147 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.877572 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.877599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.877610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.877627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.877640 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.891239 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.914988 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.937919 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.955325 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.973442 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.980006 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.980052 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.980065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.980082 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.980095 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:37Z","lastTransitionTime":"2026-03-13T11:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:37 crc kubenswrapper[4786]: I0313 11:48:37.992381 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.008158 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.020996 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026473 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5ln\" (UniqueName: \"kubernetes.io/projected/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-kube-api-access-jp5ln\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cnibin\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026576 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-k8s-cni-cncf-io\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/75da9242-3ddf-4eca-82df-a5fc998b0fdc-rootfs\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026623 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-hostroot\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026642 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-kubelet\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026665 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-multus-certs\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026687 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-etc-kubernetes\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026707 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-os-release\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026729 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-conf-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026760 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-os-release\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026783 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-cni-bin\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026803 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-system-cni-dir\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cni-binary-copy\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026845 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-cni-binary-copy\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026866 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-netns\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026907 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc 
kubenswrapper[4786]: I0313 11:48:38.026940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026968 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-system-cni-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.026987 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-cni-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.027006 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-cni-multus\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.027026 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-cnibin\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: 
I0313 11:48:38.027046 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-daemon-config\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.027067 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75da9242-3ddf-4eca-82df-a5fc998b0fdc-proxy-tls\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.027088 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75da9242-3ddf-4eca-82df-a5fc998b0fdc-mcd-auth-proxy-config\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.027107 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9b4h\" (UniqueName: \"kubernetes.io/projected/75da9242-3ddf-4eca-82df-a5fc998b0fdc-kube-api-access-w9b4h\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.027129 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-socket-dir-parent\") pod \"multus-b5xwr\" (UID: 
\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.027150 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgpf\" (UniqueName: \"kubernetes.io/projected/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-kube-api-access-hmgpf\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.033515 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11
:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.048712 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.059414 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 
2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.071112 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.083131 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.083181 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.083192 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.083209 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.083234 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.084207 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.098150 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.111231 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.120757 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.127713 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.127796 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-cni-binary-copy\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.127839 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-netns\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.127964 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128006 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-system-cni-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128039 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-cni-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128027 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-netns\") pod \"multus-b5xwr\" (UID: 
\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128076 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-cni-multus\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128110 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-cnibin\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128154 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-daemon-config\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128174 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-system-cni-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128200 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-socket-dir-parent\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128244 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgpf\" (UniqueName: \"kubernetes.io/projected/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-kube-api-access-hmgpf\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128279 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75da9242-3ddf-4eca-82df-a5fc998b0fdc-proxy-tls\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128315 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75da9242-3ddf-4eca-82df-a5fc998b0fdc-mcd-auth-proxy-config\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128352 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9b4h\" (UniqueName: \"kubernetes.io/projected/75da9242-3ddf-4eca-82df-a5fc998b0fdc-kube-api-access-w9b4h\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128399 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " 
pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128404 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5ln\" (UniqueName: \"kubernetes.io/projected/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-kube-api-access-jp5ln\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128492 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cnibin\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128512 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-k8s-cni-cncf-io\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128534 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/75da9242-3ddf-4eca-82df-a5fc998b0fdc-rootfs\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128552 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-hostroot\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc 
kubenswrapper[4786]: I0313 11:48:38.128571 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-kubelet\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-multus-certs\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128666 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-etc-kubernetes\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128681 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-os-release\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128698 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-conf-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128691 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-socket-dir-parent\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128714 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-system-cni-dir\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128750 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-system-cni-dir\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128775 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cnibin\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128800 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-k8s-cni-cncf-io\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128824 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/75da9242-3ddf-4eca-82df-a5fc998b0fdc-rootfs\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128845 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-hostroot\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128868 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-kubelet\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128840 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-cni-multus\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128910 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-run-multus-certs\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128940 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-etc-kubernetes\") pod \"multus-b5xwr\" (UID: 
\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128941 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-cni-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-cni-binary-copy\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.128985 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-os-release\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129095 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129117 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cni-binary-copy\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " 
pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129162 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-cnibin\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129231 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-conf-dir\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129350 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-os-release\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129484 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-cni-bin\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129516 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-os-release\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129583 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-host-var-lib-cni-bin\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129832 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-multus-daemon-config\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.129859 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75da9242-3ddf-4eca-82df-a5fc998b0fdc-mcd-auth-proxy-config\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.130019 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-cni-binary-copy\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.135754 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75da9242-3ddf-4eca-82df-a5fc998b0fdc-proxy-tls\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.140106 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.149059 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5ln\" (UniqueName: \"kubernetes.io/projected/cd2e61d0-5deb-4005-85b4-c6f5ae70fe62-kube-api-access-jp5ln\") pod \"multus-b5xwr\" (UID: \"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\") " pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.152825 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgpf\" (UniqueName: \"kubernetes.io/projected/b3bf7ec5-4cc0-41b9-b916-f1797cbe149c-kube-api-access-hmgpf\") pod \"multus-additional-cni-plugins-6g54w\" (UID: \"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\") " pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.154070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9b4h\" (UniqueName: \"kubernetes.io/projected/75da9242-3ddf-4eca-82df-a5fc998b0fdc-kube-api-access-w9b4h\") pod \"machine-config-daemon-8ncs8\" (UID: \"75da9242-3ddf-4eca-82df-a5fc998b0fdc\") " pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.159690 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6g54w" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.164948 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.167834 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.180082 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-b5xwr" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.181592 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.186454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.186482 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.186493 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.186509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.186522 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.194639 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: W0313 11:48:38.206126 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd2e61d0_5deb_4005_85b4_c6f5ae70fe62.slice/crio-b7a8502d8d0fe83539bfb44dd62839f016e765faab60b64b4472ec17561c23b5 WatchSource:0}: Error finding container b7a8502d8d0fe83539bfb44dd62839f016e765faab60b64b4472ec17561c23b5: Status 404 returned error can't find the container with id b7a8502d8d0fe83539bfb44dd62839f016e765faab60b64b4472ec17561c23b5 Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.207990 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.223455 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.228360 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4z4th"] Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.230255 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.233371 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.233426 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.233944 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.233941 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.233994 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.234195 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.234329 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.245558 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.256453 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.266195 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.287378 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.289300 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.289345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.289359 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.289379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.289396 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.317154 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.330579 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.330961 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-node-log\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331023 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-script-lib\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331059 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-bin\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331092 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-netd\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331126 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-systemd\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovn-node-metrics-cert\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331206 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331238 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-config\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331307 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-slash\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-var-lib-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331369 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-systemd-units\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 
11:48:38.331406 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-etc-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331454 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2cq4\" (UniqueName: \"kubernetes.io/projected/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-kube-api-access-g2cq4\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331492 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331522 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-ovn\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331572 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-netns\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc 
kubenswrapper[4786]: I0313 11:48:38.331605 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-env-overrides\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331636 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-log-socket\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331672 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.331723 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-kubelet\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.346312 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.360735 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.375603 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.391959 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf
e80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.396337 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.396371 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.396380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.396394 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.396403 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.404253 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.418096 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432447 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovn-node-metrics-cert\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432494 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432517 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-config\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432538 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-slash\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432560 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-var-lib-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432583 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-systemd-units\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432604 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-slash\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432579 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432637 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-etc-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432606 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-etc-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: 
I0313 11:48:38.432652 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-var-lib-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432703 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-systemd-units\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432798 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2cq4\" (UniqueName: \"kubernetes.io/projected/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-kube-api-access-g2cq4\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.432968 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433001 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-ovn\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433047 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-netns\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433073 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-env-overrides\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433100 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-log-socket\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433132 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433163 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-kubelet\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433235 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-node-log\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433263 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-script-lib\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433274 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-config\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433314 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-systemd\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433328 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433344 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-bin\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433356 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-log-socket\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433373 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-netd\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433447 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-netd\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433495 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-systemd\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433546 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-bin\") pod \"ovnkube-node-4z4th\" (UID: 
\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433586 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-kubelet\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433626 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-ovn\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-openvswitch\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433703 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-netns\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433743 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-node-log\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 
11:48:38.433957 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-script-lib\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.433984 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-env-overrides\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.435496 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 
2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.437132 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovn-node-metrics-cert\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.450042 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.457307 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2cq4\" (UniqueName: \"kubernetes.io/projected/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-kube-api-access-g2cq4\") pod \"ovnkube-node-4z4th\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.500520 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.500560 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.500570 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.500585 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.500596 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.587242 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.603022 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.603087 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.603113 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.603143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.603167 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: W0313 11:48:38.606427 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb3555e_af42_44e2_89e8_6f0a8d5d485c.slice/crio-d863342be7ec5832f97184ccdf598c1c1277c5dccece1a4c79732c4f8bbee786 WatchSource:0}: Error finding container d863342be7ec5832f97184ccdf598c1c1277c5dccece1a4c79732c4f8bbee786: Status 404 returned error can't find the container with id d863342be7ec5832f97184ccdf598c1c1277c5dccece1a4c79732c4f8bbee786 Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.705116 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.705151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.705162 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.705178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.705189 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.808412 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.808487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.808510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.808546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.808572 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.851167 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f" exitCode=0 Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.851273 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.851301 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"d863342be7ec5832f97184ccdf598c1c1277c5dccece1a4c79732c4f8bbee786"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.855118 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b5xwr" event={"ID":"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62","Type":"ContainerStarted","Data":"40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.855171 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b5xwr" event={"ID":"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62","Type":"ContainerStarted","Data":"b7a8502d8d0fe83539bfb44dd62839f016e765faab60b64b4472ec17561c23b5"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.858974 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.859015 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.859025 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"133c17fcdfcd14549332b834e491ac08f23feed0176ed3ee1a184228d12f6759"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.862092 4786 generic.go:334] "Generic (PLEG): container finished" podID="b3bf7ec5-4cc0-41b9-b916-f1797cbe149c" containerID="a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e" exitCode=0 Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.862152 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerDied","Data":"a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.862171 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerStarted","Data":"d9002b98491ba98944d503fb0b1f8916f0b1b4591c1109edc3892da6b5f467ae"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.865622 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rln62" event={"ID":"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5","Type":"ContainerStarted","Data":"6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.880683 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.897632 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.911918 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.911976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.911994 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.912016 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.912031 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:38Z","lastTransitionTime":"2026-03-13T11:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.923869 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.941441 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 
2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.959462 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.977262 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:38 crc kubenswrapper[4786]: I0313 11:48:38.994549 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.005967 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.019500 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.019794 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.019820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.019829 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.019843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.019857 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.050847 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.065516 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.085728 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af28
96fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.098746 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.116250 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.123246 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.123325 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.123344 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.123369 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.123387 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.136866 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7
f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.151132 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.167979 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.180367 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.192951 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.203536 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib
-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.213712 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.226176 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.228490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.228529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.228547 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.228564 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.228572 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.238926 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.252197 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.261789 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.273064 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.290395 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.301696 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.330165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.330205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.330217 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 
11:48:39.330234 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.330246 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.432409 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.432482 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.432507 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.432539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.432565 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.440340 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.440417 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.440474 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:39 crc kubenswrapper[4786]: E0313 11:48:39.440655 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:39 crc kubenswrapper[4786]: E0313 11:48:39.440798 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:39 crc kubenswrapper[4786]: E0313 11:48:39.440945 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.534847 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.534903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.534917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.534935 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.534948 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.637249 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.637303 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.637321 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.637345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.637362 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.741553 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.741640 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.741657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.741681 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.741701 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.845547 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.845598 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.845615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.845638 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.845656 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.873423 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerStarted","Data":"c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.881351 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.881402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.881433 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.881513 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.889695 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019a
b9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.908046 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.927055 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.943446 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.948649 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.948695 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.948704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.948718 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.948728 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:39Z","lastTransitionTime":"2026-03-13T11:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.961838 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.975706 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:39 crc kubenswrapper[4786]: I0313 11:48:39.990139 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.011822 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.025589 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.035962 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.051589 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.051644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.051659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.051675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.051688 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.052671 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.076638 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.090538 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.103016 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.154555 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.154599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.154610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.154626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.154638 4786 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.258215 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.258578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.258599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.258623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.258639 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.361463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.361533 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.361551 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.361574 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.361591 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.463846 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.463898 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.463909 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.463924 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.463933 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.566803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.566842 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.566856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.566878 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.566929 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.668673 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.669016 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.669025 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.669038 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.669046 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.770721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.771521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.771623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.771706 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.771778 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.875626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.875689 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.875707 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.875735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.875755 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:40Z","lastTransitionTime":"2026-03-13T11:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.887107 4786 generic.go:334] "Generic (PLEG): container finished" podID="b3bf7ec5-4cc0-41b9-b916-f1797cbe149c" containerID="c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a" exitCode=0 Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.887217 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerDied","Data":"c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.897376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.897445 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"} Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.906456 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019a
b9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.935814 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:40 crc kubenswrapper[4786]: I0313 11:48:40.952772 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.035731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.035840 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.035875 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 
11:48:41.035938 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.035950 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.035941 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.052208 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.064202 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.076228 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.102342 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.114100 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.126075 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.139476 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.139960 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.139984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.139993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.140007 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.140016 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.160333 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.175919 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.186592 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.241967 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.242013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.242025 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.242041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.242054 4786 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.251053 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.251099 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.251115 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.251138 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.251156 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: E0313 11:48:41.262620 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.266112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.266143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.266152 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.266165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.266175 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: E0313 11:48:41.307105 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.309604 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.309642 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.309651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.309665 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.309675 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: E0313 11:48:41.321086 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: E0313 11:48:41.321192 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.344569 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.344622 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.344637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.344655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.344669 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.439913 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.439953 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.439958 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:41 crc kubenswrapper[4786]: E0313 11:48:41.440088 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:41 crc kubenswrapper[4786]: E0313 11:48:41.440194 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:41 crc kubenswrapper[4786]: E0313 11:48:41.440335 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.440944 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.446211 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.446265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.446277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.446294 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.446305 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.549235 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.549268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.549278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.549294 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.549304 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.652089 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.652114 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.652121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.652134 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.652142 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.755379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.755413 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.755424 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.755440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.755451 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.857410 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.857445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.857455 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.857470 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.857480 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.902735 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.904128 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.908143 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.922099 4786 generic.go:334] "Generic (PLEG): container finished" podID="b3bf7ec5-4cc0-41b9-b916-f1797cbe149c" containerID="b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e" exitCode=0 Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.922153 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerDied","Data":"b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.942739 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.960941 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.960990 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.961012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.961045 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.961069 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:41Z","lastTransitionTime":"2026-03-13T11:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.962454 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.977905 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:41 crc kubenswrapper[4786]: I0313 11:48:41.991001 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.004656 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.016785 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.028669 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.042745 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.055184 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-o
verrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.063668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.063702 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.063713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.063732 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.063744 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.070283 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.086033 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.095588 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.105598 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.121997 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.134198 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.148660 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.165504 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.166782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.166815 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.166826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.166843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.166854 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.181917 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.197874 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.208867 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.236218 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.256829 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.268772 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.268805 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.268815 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 
11:48:42.268838 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.268851 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.274011 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.286720 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019a
b9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.301680 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.313406 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.334536 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.345324 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.372110 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.372149 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.372160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc 
kubenswrapper[4786]: I0313 11:48:42.372176 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.372187 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.475167 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.475222 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.475243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.475271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.475292 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.578121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.578168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.578184 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.578207 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.578223 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.684006 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.684082 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.684105 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.684138 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.684160 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.786539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.786622 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.786649 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.786684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.786708 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.889975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.890055 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.890081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.890112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.890137 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.938172 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.941373 4786 generic.go:334] "Generic (PLEG): container finished" podID="b3bf7ec5-4cc0-41b9-b916-f1797cbe149c" containerID="118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9" exitCode=0 Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.941675 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerDied","Data":"118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.970845 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.994094 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.994124 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.994132 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.994148 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.994157 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:42Z","lastTransitionTime":"2026-03-13T11:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:42 crc kubenswrapper[4786]: I0313 11:48:42.995236 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.007338 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.021496 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.034689 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.048459 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.069534 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.082487 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.095420 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.097603 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.097637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.097649 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.097665 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.097674 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.106720 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.121714 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.137179 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.147977 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.157991 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.200381 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.200429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.200442 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.200458 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.200471 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.304133 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.304203 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.304220 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.304251 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.304274 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.408075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.408132 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.408150 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.408174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.408192 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.439692 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.439725 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.439725 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:43 crc kubenswrapper[4786]: E0313 11:48:43.439867 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:43 crc kubenswrapper[4786]: E0313 11:48:43.440050 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:43 crc kubenswrapper[4786]: E0313 11:48:43.443034 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.457826 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.469624 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.487755 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.512031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.512095 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.512111 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.512135 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.512152 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.518583 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.550940 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.570189 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.588121 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.606637 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.615299 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc 
kubenswrapper[4786]: I0313 11:48:43.615352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.615370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.615415 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.615437 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.629741 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.653699 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.674228 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.694668 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.717774 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.717845 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.717863 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.717914 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.717933 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.725340 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.751504 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.822333 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.822385 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.822401 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.822425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.822444 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.926033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.926118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.926137 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.926735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.926801 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:43Z","lastTransitionTime":"2026-03-13T11:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.949281 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerStarted","Data":"7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563"} Mar 13 11:48:43 crc kubenswrapper[4786]: I0313 11:48:43.986130 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:4
6:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.008688 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.030596 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.030668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.030692 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.030720 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.030743 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.030227 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.052933 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.072294 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.129302 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.135509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.135580 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.135614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.135643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.135661 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.144132 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.162966 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.183635 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.207110 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmg
pf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.224238 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lwrsl"] Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.224789 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.228325 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.228436 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.228592 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.230250 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.230575 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.239450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.239500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.239518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.239543 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.239562 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.246224 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.261649 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.283373 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.294122 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-serviceca\") pod \"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.294196 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqgd4\" (UniqueName: \"kubernetes.io/projected/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-kube-api-access-pqgd4\") pod \"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.294239 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-host\") pod \"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.305796 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.327875 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.338748 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.341232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.341271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.341281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.341295 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.341304 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.351039 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.361805 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.379273 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmg
pf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.393400 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.395588 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqgd4\" (UniqueName: \"kubernetes.io/projected/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-kube-api-access-pqgd4\") pod \"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.395631 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-host\") pod \"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.395659 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-serviceca\") pod 
\"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.395714 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-host\") pod \"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.396549 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-serviceca\") pod \"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.404240 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.415125 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqgd4\" (UniqueName: \"kubernetes.io/projected/da7e4ae2-afe2-4408-921d-d9ecb7c8c803-kube-api-access-pqgd4\") pod \"node-ca-lwrsl\" (UID: \"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\") " pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.418654 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019a
b9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.437271 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.446916 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.446956 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.446968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.446993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.447006 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.449752 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.469752 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.482306 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.492826 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.507014 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.548381 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lwrsl" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.550017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.550044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.550054 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.550070 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.550080 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.651970 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.652247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.652258 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.652273 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.652283 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.755651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.755700 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.755709 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.755724 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.755735 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.857708 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.857737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.857746 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.857758 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.857767 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.955675 4786 generic.go:334] "Generic (PLEG): container finished" podID="b3bf7ec5-4cc0-41b9-b916-f1797cbe149c" containerID="7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563" exitCode=0 Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.955763 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerDied","Data":"7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.958849 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lwrsl" event={"ID":"da7e4ae2-afe2-4408-921d-d9ecb7c8c803","Type":"ContainerStarted","Data":"3a5645567f66699027a57335e92a9d2ae4ebed227c753a92d04867ab9ffe027d"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.960243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.960277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.960293 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.960314 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.960329 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:44Z","lastTransitionTime":"2026-03-13T11:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:44 crc kubenswrapper[4786]: I0313 11:48:44.977617 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:44Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.002476 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.014294 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.027079 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.035810 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.046302 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.055285 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.063015 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.063098 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.063119 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.063180 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.063199 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.065954 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9
a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.085141 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b267
02f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a
215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.097916 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.109305 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.127581 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.146155 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.163188 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.165178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.165221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.165230 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.165245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.165254 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.174688 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.267543 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.267580 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.267588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.267601 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.267610 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.369910 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.369971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.369987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.370009 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.370026 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.440422 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:45 crc kubenswrapper[4786]: E0313 11:48:45.440632 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.441034 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:45 crc kubenswrapper[4786]: E0313 11:48:45.441189 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.441733 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:45 crc kubenswrapper[4786]: E0313 11:48:45.441869 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.472581 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.472645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.472664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.472687 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.472703 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.576617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.576672 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.576698 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.576721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.576740 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.680576 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.680646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.680683 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.680714 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.680741 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.784725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.785091 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.785103 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.785118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.785128 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.889205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.889271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.889289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.889314 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.889333 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.966316 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lwrsl" event={"ID":"da7e4ae2-afe2-4408-921d-d9ecb7c8c803","Type":"ContainerStarted","Data":"fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.975656 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.976450 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.976500 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.976521 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.982506 4786 generic.go:334] "Generic (PLEG): container finished" podID="b3bf7ec5-4cc0-41b9-b916-f1797cbe149c" containerID="7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4" exitCode=0 Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.982597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerDied","Data":"7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4"} Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.987363 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.996594 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.996646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.996664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.996691 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:45 crc kubenswrapper[4786]: I0313 11:48:45.996707 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:45Z","lastTransitionTime":"2026-03-13T11:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.006597 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.016615 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.022048 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.043949 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.060627 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.082705 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.104306 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.104347 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.104360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.104381 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.104397 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.118710 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.133185 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.145367 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.158488 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.172941 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.187200 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib
-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.205501 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.207690 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.207719 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.207731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.207748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.207760 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.224763 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.240991 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.257291 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.289519 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.304548 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.309615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.309646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.309654 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.309666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.309675 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.321806 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.337906 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.358282 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.378951 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.391630 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.407281 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.411740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.411766 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.411775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.411788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.411797 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.427685 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.444610 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.458840 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.474846 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.495190 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.516258 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.516313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.516331 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.516355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.516374 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.525404 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.539807 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.619091 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.619158 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.619175 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.619199 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 
11:48:46.619216 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.721774 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.722083 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.722115 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.722142 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.722161 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.832550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.832620 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.832639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.832664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.832689 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.936138 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.936169 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.936177 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.936191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.936199 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:46Z","lastTransitionTime":"2026-03-13T11:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:46 crc kubenswrapper[4786]: I0313 11:48:46.990652 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" event={"ID":"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c","Type":"ContainerStarted","Data":"502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.015001 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.029483 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.039979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.040326 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.040379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.040454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.040565 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.046466 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.066521 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.090577 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.110357 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.124447 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.137538 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.143035 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.143085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.143102 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.143123 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.143139 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.167322 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.185155 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.215454 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.234372 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.247832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.247906 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.247924 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc 
kubenswrapper[4786]: I0313 11:48:47.247945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.247961 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.253409 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.274531 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.295517 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.350862 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.350947 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.350966 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.350987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.351004 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.439784 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.439931 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:47 crc kubenswrapper[4786]: E0313 11:48:47.439962 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.440006 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:47 crc kubenswrapper[4786]: E0313 11:48:47.440114 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:47 crc kubenswrapper[4786]: E0313 11:48:47.440172 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.452828 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.452863 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.452871 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.452904 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.452916 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.555108 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.555151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.555160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.555174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.555183 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.657627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.657666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.657675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.657690 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.657698 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.761327 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.761367 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.761378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.761393 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.761402 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.872921 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.872969 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.872981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.872999 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.873011 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.977500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.977548 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.977561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.977579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:47 crc kubenswrapper[4786]: I0313 11:48:47.977591 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:47Z","lastTransitionTime":"2026-03-13T11:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.080430 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.080651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.080662 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.080679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.080688 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.184168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.184226 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.184243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.184265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.184282 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.286805 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.286854 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.286868 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.286907 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.286919 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.390055 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.390111 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.390133 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.390162 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.390185 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.494088 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.494151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.494168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.494194 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.494215 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.597383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.597450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.597474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.597504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.597526 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.700395 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.700453 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.700496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.700520 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.700537 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.803574 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.803617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.803633 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.803657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.803675 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.907073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.907134 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.907178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.907210 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:48 crc kubenswrapper[4786]: I0313 11:48:48.907234 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:48Z","lastTransitionTime":"2026-03-13T11:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:48.999939 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/0.log" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.004182 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131" exitCode=1 Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.004281 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.005334 4786 scope.go:117] "RemoveContainer" containerID="e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.010517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.010557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.010573 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.010595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.010612 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.028813 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.053003 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.069615 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.090918 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.111765 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-o
verrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.114242 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.114286 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.114301 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.114322 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.114337 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.132695 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.146363 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.159240 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.175250 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.199341 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:48Z\\\",\\\"message\\\":\\\" 6640 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:48.070271 6640 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:48.070324 6640 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0313 11:48:48.070337 6640 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:48:48.070379 6640 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:48.070382 6640 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:48.070420 6640 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:48.070431 6640 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:48:48.070445 6640 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:48.070460 6640 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:48.070471 6640 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:48:48.070491 6640 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:48:48.070492 6640 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:48:48.070517 6640 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:48:48.070553 6640 factory.go:656] Stopping watch factory\\\\nI0313 11:48:48.070577 6640 ovnkube.go:599] Stopped ovnkube\\\\nI0313 
11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.213032 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.216832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.216869 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.216883 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.216920 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.216934 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.245735 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.262919 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.278532 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.292828 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:49Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.320320 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc 
kubenswrapper[4786]: I0313 11:48:49.320368 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.320379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.320396 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.320408 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.423797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.423956 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.423976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.424000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.424016 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.439595 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:49 crc kubenswrapper[4786]: E0313 11:48:49.439771 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.439554 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:49 crc kubenswrapper[4786]: E0313 11:48:49.439921 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.439608 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:49 crc kubenswrapper[4786]: E0313 11:48:49.440019 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.527024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.527076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.527087 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.527105 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.527117 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.630633 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.630770 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.630792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.630817 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.630877 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.733599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.733673 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.733698 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.733728 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.733750 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.837278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.837336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.837353 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.837382 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.837404 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.944118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.944165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.944181 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.944205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:49 crc kubenswrapper[4786]: I0313 11:48:49.944222 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:49Z","lastTransitionTime":"2026-03-13T11:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.029375 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/0.log" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.032912 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.033363 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.046771 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.046819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.046855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.046874 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.046898 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.070080 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.086454 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.098671 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.111031 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.123944 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.136928 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.145637 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.149223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.149254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.149264 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.149281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.149293 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.158715 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.172639 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.224153 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.244447 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.252081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.252112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.252120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.252132 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.252141 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.256414 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8"] Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.257288 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.259587 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.264088 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.265270 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.285294 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.323251 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:48Z\\\",\\\"message\\\":\\\" 6640 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:48.070271 6640 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:48.070324 6640 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:48.070337 6640 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 
11:48:48.070379 6640 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:48.070382 6640 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:48.070420 6640 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:48.070431 6640 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:48:48.070445 6640 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:48.070460 6640 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:48.070471 6640 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:48:48.070491 6640 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:48:48.070492 6640 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:48:48.070517 6640 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:48:48.070553 6640 factory.go:656] Stopping watch factory\\\\nI0313 11:48:48.070577 6640 ovnkube.go:599] Stopped ovnkube\\\\nI0313 
11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.335287 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.349897 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.354873 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.354911 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.354919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.354930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.354939 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.363146 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.378314 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019a
b9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.380765 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vht7h\" (UniqueName: \"kubernetes.io/projected/97b994ad-2b42-41b1-9976-bfe949acbc91-kube-api-access-vht7h\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.380797 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97b994ad-2b42-41b1-9976-bfe949acbc91-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.380823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97b994ad-2b42-41b1-9976-bfe949acbc91-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.380848 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97b994ad-2b42-41b1-9976-bfe949acbc91-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.398755 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:48Z\\\",\\\"message\\\":\\\" 6640 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:48.070271 6640 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:48.070324 6640 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:48.070337 6640 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 
11:48:48.070379 6640 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:48.070382 6640 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:48.070420 6640 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:48.070431 6640 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:48:48.070445 6640 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:48.070460 6640 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:48.070471 6640 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:48:48.070491 6640 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:48:48.070492 6640 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:48:48.070517 6640 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:48:48.070553 6640 factory.go:656] Stopping watch factory\\\\nI0313 11:48:48.070577 6640 ovnkube.go:599] Stopped ovnkube\\\\nI0313 
11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.412490 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.427229 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.457383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.457440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.457458 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.457481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.457497 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.457668 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.474118 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.481799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97b994ad-2b42-41b1-9976-bfe949acbc91-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.481846 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vht7h\" (UniqueName: \"kubernetes.io/projected/97b994ad-2b42-41b1-9976-bfe949acbc91-kube-api-access-vht7h\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.481867 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97b994ad-2b42-41b1-9976-bfe949acbc91-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.481909 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97b994ad-2b42-41b1-9976-bfe949acbc91-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.482445 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97b994ad-2b42-41b1-9976-bfe949acbc91-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.483755 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97b994ad-2b42-41b1-9976-bfe949acbc91-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.498254 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.499967 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97b994ad-2b42-41b1-9976-bfe949acbc91-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.502875 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vht7h\" (UniqueName: \"kubernetes.io/projected/97b994ad-2b42-41b1-9976-bfe949acbc91-kube-api-access-vht7h\") pod \"ovnkube-control-plane-749d76644c-rz9x8\" (UID: \"97b994ad-2b42-41b1-9976-bfe949acbc91\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 
11:48:50.515049 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 
11:48:50.529529 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.550267 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.561876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.561941 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.561954 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.561982 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.561995 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.570722 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.580497 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.595470 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.628051 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnku
be-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.646396 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:50Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.665992 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.666026 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.666035 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.666050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.666060 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.768735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.768778 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.768791 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.768810 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.768825 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.873121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.873203 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.873228 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.873254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.873272 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.976464 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.976527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.976546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.976571 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:50 crc kubenswrapper[4786]: I0313 11:48:50.976588 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:50Z","lastTransitionTime":"2026-03-13T11:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.013470 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-g4pzt"] Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.014568 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.014812 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.030430 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.039941 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/1.log" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.040744 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/0.log" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.046040 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369" exitCode=1 Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.046198 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.046301 4786 scope.go:117] "RemoveContainer" containerID="e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.047196 4786 scope.go:117] "RemoveContainer" containerID="6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.047450 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.048951 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" event={"ID":"97b994ad-2b42-41b1-9976-bfe949acbc91","Type":"ContainerStarted","Data":"11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.048997 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" event={"ID":"97b994ad-2b42-41b1-9976-bfe949acbc91","Type":"ContainerStarted","Data":"0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.049024 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" event={"ID":"97b994ad-2b42-41b1-9976-bfe949acbc91","Type":"ContainerStarted","Data":"463a09eab798f6615e0a144b6c9f7f4c876198fb500e896455ca15381889133f"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.054454 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.072040 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.078675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.078731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.078742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.078764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.078776 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.088119 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjgjb\" (UniqueName: \"kubernetes.io/projected/c19009bf-0d5a-458f-8c3e-97bc203741b1-kube-api-access-bjgjb\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.088199 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.092922 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.113000 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.127315 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\
\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.148729 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:48Z\\\",\\\"message\\\":\\\" 6640 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:48.070271 6640 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:48.070324 6640 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:48.070337 6640 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 
11:48:48.070379 6640 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:48.070382 6640 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:48.070420 6640 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:48.070431 6640 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:48:48.070445 6640 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:48.070460 6640 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:48.070471 6640 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:48:48.070491 6640 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:48:48.070492 6640 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:48:48.070517 6640 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:48:48.070553 6640 factory.go:656] Stopping watch factory\\\\nI0313 11:48:48.070577 6640 ovnkube.go:599] Stopped ovnkube\\\\nI0313 
11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.158051 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.168142 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.176816 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc 
kubenswrapper[4786]: I0313 11:48:51.180543 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.180575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.180584 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.180599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.180608 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.187038 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.189300 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.189443 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.189488 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs podName:c19009bf-0d5a-458f-8c3e-97bc203741b1 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:51.689473945 +0000 UTC m=+118.969127382 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs") pod "network-metrics-daemon-g4pzt" (UID: "c19009bf-0d5a-458f-8c3e-97bc203741b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.189758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjgjb\" (UniqueName: \"kubernetes.io/projected/c19009bf-0d5a-458f-8c3e-97bc203741b1-kube-api-access-bjgjb\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.196897 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.206565 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjgjb\" (UniqueName: \"kubernetes.io/projected/c19009bf-0d5a-458f-8c3e-97bc203741b1-kube-api-access-bjgjb\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.209101 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.219439 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.234664 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.251636 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.264332 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.282761 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.282792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.282801 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.282813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.282822 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.283946 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.290435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.290492 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.290586 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.290604 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.290634 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:23.290622111 +0000 UTC m=+150.570275558 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.290680 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:23.290660452 +0000 UTC m=+150.570313899 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.300315 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.313432 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.324619 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.339822 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.353115 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-o
verrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.365174 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.377936 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.385210 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.385451 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.385473 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.385504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.385527 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.391625 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.391720 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.391702 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.391789 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.391946 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.391966 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.391977 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:23.391933141 +0000 UTC m=+150.671586648 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.391999 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.392042 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.391972 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.392101 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.392119 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-13 11:49:23.392094915 +0000 UTC m=+150.671748402 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.392187 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:23.392164277 +0000 UTC m=+150.671817844 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.406853 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.416463 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.429620 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.439761 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.439866 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.439955 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.440013 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.440173 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.440280 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.462191 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9cb65bde20e5e72698ea0c6880ede56a46dea453fe50c4b640a3b7a987ca131\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:48Z\\\",\\\"message\\\":\\\" 6640 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:48.070271 6640 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:48.070324 6640 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:48.070337 6640 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 
11:48:48.070379 6640 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:48.070382 6640 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:48.070420 6640 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:48.070431 6640 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:48:48.070445 6640 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:48.070460 6640 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:48.070471 6640 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:48:48.070491 6640 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:48:48.070492 6640 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:48:48.070517 6640 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:48:48.070553 6640 factory.go:656] Stopping watch factory\\\\nI0313 11:48:48.070577 6640 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"Egress Service node crc\\\\nI0313 11:48:50.605846 6835 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 640.396µs\\\\nI0313 11:48:50.608010 6835 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:48:50.611342 6835 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:50.617535 6835 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:50.617574 6835 
handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:48:50.617634 6835 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:50.617646 6835 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:50.617689 6835 factory.go:656] Stopping watch factory\\\\nI0313 11:48:50.617705 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:48:50.617732 6835 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:50.617746 6835 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:50.617763 6835 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:50.617773 6835 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:50.617784 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:48:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{
\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.477194 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin
\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.488830 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.488921 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.488942 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.488968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.488986 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.501287 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.517661 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.534130 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.546418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.546451 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.546478 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.546494 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.546503 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.557682 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.561548 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.561579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.561607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.561621 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.561630 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.578649 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.586351 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.586407 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.586417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.586432 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.586443 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.598808 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.603212 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.603243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.603252 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.603266 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.603292 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.618927 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.623485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.623514 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.623522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.623534 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.623556 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.639064 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:51Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.639278 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.641173 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.641225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.641243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.641269 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.641287 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.695042 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.695339 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: E0313 11:48:51.695448 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs podName:c19009bf-0d5a-458f-8c3e-97bc203741b1 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:52.695420329 +0000 UTC m=+119.975073836 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs") pod "network-metrics-daemon-g4pzt" (UID: "c19009bf-0d5a-458f-8c3e-97bc203741b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.743368 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.743424 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.743440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.743464 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.743482 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.848319 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.848403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.848428 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.848459 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.848491 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.951947 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.952006 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.952051 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.952077 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:51 crc kubenswrapper[4786]: I0313 11:48:51.952094 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:51Z","lastTransitionTime":"2026-03-13T11:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.055006 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.055066 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.056060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.056134 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.056160 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.057118 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/1.log" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.063696 4786 scope.go:117] "RemoveContainer" containerID="6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369" Mar 13 11:48:52 crc kubenswrapper[4786]: E0313 11:48:52.064011 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.082403 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249
fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.109426 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc 
kubenswrapper[4786]: I0313 11:48:52.128848 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.143018 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.155374 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.159581 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.159627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.159644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.159668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.159684 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.187592 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"Egress Service node crc\\\\nI0313 11:48:50.605846 6835 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 640.396µs\\\\nI0313 11:48:50.608010 6835 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 
11:48:50.611342 6835 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:50.617535 6835 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:50.617574 6835 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:48:50.617634 6835 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:50.617646 6835 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:50.617689 6835 factory.go:656] Stopping watch factory\\\\nI0313 11:48:50.617705 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:48:50.617732 6835 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:50.617746 6835 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:50.617763 6835 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:50.617773 6835 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:50.617784 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:48:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.202824 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.234320 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11
:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.252804 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.262998 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.263073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.263100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.263131 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.263155 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.273543 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.294776 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.315565 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.336418 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.353040 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.366569 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.366628 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.366652 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.366682 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.366704 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.369974 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.391618 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.413319 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:52Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.439953 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:52 crc kubenswrapper[4786]: E0313 11:48:52.440122 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.469588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.469652 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.469669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.469696 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.469715 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.573358 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.573421 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.573438 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.573461 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.573479 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.677246 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.677315 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.677332 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.677359 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.677375 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.709917 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:52 crc kubenswrapper[4786]: E0313 11:48:52.710085 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:52 crc kubenswrapper[4786]: E0313 11:48:52.710182 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs podName:c19009bf-0d5a-458f-8c3e-97bc203741b1 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:54.710157329 +0000 UTC m=+121.989810806 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs") pod "network-metrics-daemon-g4pzt" (UID: "c19009bf-0d5a-458f-8c3e-97bc203741b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.780789 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.780850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.780870 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.780938 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.780956 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.883796 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.883861 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.883951 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.883979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.883996 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.987386 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.987443 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.987459 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.987482 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:52 crc kubenswrapper[4786]: I0313 11:48:52.987502 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:52Z","lastTransitionTime":"2026-03-13T11:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.090282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.090342 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.090360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.090384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.090407 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:53Z","lastTransitionTime":"2026-03-13T11:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.193997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.194060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.194079 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.194100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.194116 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:53Z","lastTransitionTime":"2026-03-13T11:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.296555 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.296632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.296652 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.296678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.296697 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:48:53Z","lastTransitionTime":"2026-03-13T11:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:48:53 crc kubenswrapper[4786]: E0313 11:48:53.396932 4786 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.439946 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.440019 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.439959 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:53 crc kubenswrapper[4786]: E0313 11:48:53.441304 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:53 crc kubenswrapper[4786]: E0313 11:48:53.441463 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:53 crc kubenswrapper[4786]: E0313 11:48:53.441198 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.474990 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.496749 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.517053 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.538630 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.561710 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.586668 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.603933 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.622239 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: E0313 11:48:53.626773 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.639463 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.662962 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.683176 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.697480 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.716975 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.746362 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"Egress Service node crc\\\\nI0313 11:48:50.605846 6835 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 640.396µs\\\\nI0313 11:48:50.608010 6835 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 
11:48:50.611342 6835 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:50.617535 6835 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:50.617574 6835 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:48:50.617634 6835 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:50.617646 6835 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:50.617689 6835 factory.go:656] Stopping watch factory\\\\nI0313 11:48:50.617705 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:48:50.617732 6835 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:50.617746 6835 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:50.617763 6835 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:50.617773 6835 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:50.617784 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:48:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.761500 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.776523 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:53 crc kubenswrapper[4786]: I0313 11:48:53.795668 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:53Z is after 2025-08-24T17:21:41Z" Mar 13 11:48:54 crc 
kubenswrapper[4786]: I0313 11:48:54.439572 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:54 crc kubenswrapper[4786]: E0313 11:48:54.441172 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:48:54 crc kubenswrapper[4786]: I0313 11:48:54.733990 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:54 crc kubenswrapper[4786]: E0313 11:48:54.734161 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:54 crc kubenswrapper[4786]: E0313 11:48:54.734241 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs podName:c19009bf-0d5a-458f-8c3e-97bc203741b1 nodeName:}" failed. No retries permitted until 2026-03-13 11:48:58.734218742 +0000 UTC m=+126.013872219 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs") pod "network-metrics-daemon-g4pzt" (UID: "c19009bf-0d5a-458f-8c3e-97bc203741b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:55 crc kubenswrapper[4786]: I0313 11:48:55.440006 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:55 crc kubenswrapper[4786]: I0313 11:48:55.440071 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:55 crc kubenswrapper[4786]: I0313 11:48:55.440192 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:55 crc kubenswrapper[4786]: E0313 11:48:55.440188 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:55 crc kubenswrapper[4786]: E0313 11:48:55.440328 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:55 crc kubenswrapper[4786]: E0313 11:48:55.440436 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:56 crc kubenswrapper[4786]: I0313 11:48:56.440325 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:56 crc kubenswrapper[4786]: E0313 11:48:56.440521 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:48:57 crc kubenswrapper[4786]: I0313 11:48:57.440077 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:57 crc kubenswrapper[4786]: I0313 11:48:57.440140 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:57 crc kubenswrapper[4786]: E0313 11:48:57.440197 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:48:57 crc kubenswrapper[4786]: E0313 11:48:57.440274 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:57 crc kubenswrapper[4786]: I0313 11:48:57.440327 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:57 crc kubenswrapper[4786]: E0313 11:48:57.440495 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:58 crc kubenswrapper[4786]: I0313 11:48:58.440061 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:58 crc kubenswrapper[4786]: E0313 11:48:58.440262 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:48:58 crc kubenswrapper[4786]: E0313 11:48:58.628565 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:48:58 crc kubenswrapper[4786]: I0313 11:48:58.784471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:48:58 crc kubenswrapper[4786]: E0313 11:48:58.784634 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:58 crc kubenswrapper[4786]: E0313 11:48:58.784703 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs podName:c19009bf-0d5a-458f-8c3e-97bc203741b1 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:06.784685468 +0000 UTC m=+134.064338925 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs") pod "network-metrics-daemon-g4pzt" (UID: "c19009bf-0d5a-458f-8c3e-97bc203741b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:48:59 crc kubenswrapper[4786]: I0313 11:48:59.440622 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:48:59 crc kubenswrapper[4786]: E0313 11:48:59.440806 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:48:59 crc kubenswrapper[4786]: I0313 11:48:59.441135 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:48:59 crc kubenswrapper[4786]: I0313 11:48:59.441188 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:48:59 crc kubenswrapper[4786]: E0313 11:48:59.441284 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:48:59 crc kubenswrapper[4786]: E0313 11:48:59.441441 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:00 crc kubenswrapper[4786]: I0313 11:49:00.439725 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:00 crc kubenswrapper[4786]: E0313 11:49:00.439942 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:00 crc kubenswrapper[4786]: I0313 11:49:00.978042 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:49:00 crc kubenswrapper[4786]: I0313 11:49:00.999125 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.016531 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.035315 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.067167 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"Egress Service node crc\\\\nI0313 11:48:50.605846 6835 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 640.396µs\\\\nI0313 11:48:50.608010 6835 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 
11:48:50.611342 6835 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:50.617535 6835 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:50.617574 6835 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:48:50.617634 6835 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:50.617646 6835 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:50.617689 6835 factory.go:656] Stopping watch factory\\\\nI0313 11:48:50.617705 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:48:50.617732 6835 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:50.617746 6835 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:50.617763 6835 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:50.617773 6835 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:50.617784 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:48:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.084866 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.102701 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.117071 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc 
kubenswrapper[4786]: I0313 11:49:01.158565 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.200241 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.221679 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.235489 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.247689 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.261349 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.275261 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.287187 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.300784 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.317704 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.440292 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.440336 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.440346 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:01 crc kubenswrapper[4786]: E0313 11:49:01.440509 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:01 crc kubenswrapper[4786]: E0313 11:49:01.440915 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:01 crc kubenswrapper[4786]: E0313 11:49:01.441144 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.995759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.995803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.995813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.995830 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:01 crc kubenswrapper[4786]: I0313 11:49:01.995843 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:01Z","lastTransitionTime":"2026-03-13T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:02 crc kubenswrapper[4786]: E0313 11:49:02.014649 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.018848 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.018939 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.018961 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.018992 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.019014 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:02Z","lastTransitionTime":"2026-03-13T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:02 crc kubenswrapper[4786]: E0313 11:49:02.038765 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.043934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.044004 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.044021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.044050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.044068 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:02Z","lastTransitionTime":"2026-03-13T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:02 crc kubenswrapper[4786]: E0313 11:49:02.063306 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.068224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.068271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.068288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.068311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.068329 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:02Z","lastTransitionTime":"2026-03-13T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:02 crc kubenswrapper[4786]: E0313 11:49:02.083571 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.087941 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.088037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.088059 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.088081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.088098 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:02Z","lastTransitionTime":"2026-03-13T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:02 crc kubenswrapper[4786]: E0313 11:49:02.100723 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:02 crc kubenswrapper[4786]: E0313 11:49:02.100953 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.440444 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:02 crc kubenswrapper[4786]: E0313 11:49:02.440672 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:02 crc kubenswrapper[4786]: I0313 11:49:02.442105 4786 scope.go:117] "RemoveContainer" containerID="6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.108092 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/1.log" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.111075 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884"} Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.111395 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.131673 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.147833 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.160546 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.192462 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.210972 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.226742 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.243611 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.266218 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.283687 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.300611 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.316134 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019a
b9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.339021 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"Egress Service node crc\\\\nI0313 11:48:50.605846 6835 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 640.396µs\\\\nI0313 11:48:50.608010 6835 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 
11:48:50.611342 6835 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:50.617535 6835 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:50.617574 6835 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:48:50.617634 6835 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:50.617646 6835 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:50.617689 6835 factory.go:656] Stopping watch factory\\\\nI0313 11:48:50.617705 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:48:50.617732 6835 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:50.617746 6835 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:50.617763 6835 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:50.617773 6835 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:50.617784 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 
11:48:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.351421 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.366295 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.380804 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc 
kubenswrapper[4786]: I0313 11:49:03.394009 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.404521 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.440167 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.440193 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.440254 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:03 crc kubenswrapper[4786]: E0313 11:49:03.440770 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:03 crc kubenswrapper[4786]: E0313 11:49:03.441043 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:03 crc kubenswrapper[4786]: E0313 11:49:03.441171 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.464562 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of 
http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.481841 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.504733 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.520342 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.539977 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.559276 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.570670 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.580624 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.597767 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"Egress Service node crc\\\\nI0313 11:48:50.605846 6835 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 640.396µs\\\\nI0313 11:48:50.608010 6835 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 
11:48:50.611342 6835 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:50.617535 6835 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:50.617574 6835 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:48:50.617634 6835 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:50.617646 6835 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:50.617689 6835 factory.go:656] Stopping watch factory\\\\nI0313 11:48:50.617705 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:48:50.617732 6835 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:50.617746 6835 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:50.617763 6835 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:50.617773 6835 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:50.617784 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 
11:48:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.609802 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.620097 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: E0313 11:49:03.630438 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.630869 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc 
kubenswrapper[4786]: I0313 11:49:03.662283 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.674586 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.687787 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.699908 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:03 crc kubenswrapper[4786]: I0313 11:49:03.716251 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:03Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.117996 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/2.log" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.118768 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/1.log" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.124315 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884" exitCode=1 Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.124369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884"} Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.124423 4786 scope.go:117] "RemoveContainer" containerID="6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.126029 4786 scope.go:117] "RemoveContainer" containerID="cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884" Mar 13 11:49:04 crc kubenswrapper[4786]: E0313 11:49:04.127123 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 
11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.149028 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.178048 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.205243 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.222319 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.244170 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.276459 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6182cfb6c232270425d716aae79f874ad0186f226495bcea65c34218c0b0d369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"message\\\":\\\"Egress Service node crc\\\\nI0313 11:48:50.605846 6835 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 640.396µs\\\\nI0313 11:48:50.608010 6835 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 
11:48:50.611342 6835 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:48:50.617535 6835 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:48:50.617574 6835 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:48:50.617634 6835 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:48:50.617646 6835 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:48:50.617689 6835 factory.go:656] Stopping watch factory\\\\nI0313 11:48:50.617705 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:48:50.617732 6835 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:48:50.617746 6835 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:48:50.617763 6835 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:48:50.617773 6835 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:48:50.617784 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:48:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 
11:49:04.295162 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.316406 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.333748 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc 
kubenswrapper[4786]: I0313 11:49:04.356097 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.373196 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.391554 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.409748 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.429513 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.439793 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:04 crc kubenswrapper[4786]: E0313 11:49:04.440025 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.466420 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T
11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.487075 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:04 crc kubenswrapper[4786]: I0313 11:49:04.508085 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:04Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.131047 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/2.log" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.136974 4786 scope.go:117] "RemoveContainer" containerID="cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884" Mar 13 11:49:05 crc kubenswrapper[4786]: E0313 11:49:05.137514 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.157218 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.179145 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.202825 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.223752 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.238175 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.257302 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019a
b9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.282309 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.297211 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.311437 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.327288 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc 
kubenswrapper[4786]: I0313 11:49:05.345965 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.361147 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.380176 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.401481 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.419784 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.440447 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.440448 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:05 crc kubenswrapper[4786]: E0313 11:49:05.440611 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.440862 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:05 crc kubenswrapper[4786]: E0313 11:49:05.441036 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:05 crc kubenswrapper[4786]: E0313 11:49:05.441195 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.455323 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:05 crc kubenswrapper[4786]: I0313 11:49:05.476541 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:06 crc kubenswrapper[4786]: I0313 11:49:06.440416 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:06 crc kubenswrapper[4786]: E0313 11:49:06.440616 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:06 crc kubenswrapper[4786]: I0313 11:49:06.866278 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:06 crc kubenswrapper[4786]: E0313 11:49:06.866547 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:06 crc kubenswrapper[4786]: E0313 11:49:06.866670 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs podName:c19009bf-0d5a-458f-8c3e-97bc203741b1 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:22.866636334 +0000 UTC m=+150.146289821 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs") pod "network-metrics-daemon-g4pzt" (UID: "c19009bf-0d5a-458f-8c3e-97bc203741b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:07 crc kubenswrapper[4786]: I0313 11:49:07.440478 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:07 crc kubenswrapper[4786]: I0313 11:49:07.440530 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:07 crc kubenswrapper[4786]: E0313 11:49:07.440697 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:07 crc kubenswrapper[4786]: I0313 11:49:07.440803 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:07 crc kubenswrapper[4786]: E0313 11:49:07.441066 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:07 crc kubenswrapper[4786]: E0313 11:49:07.441231 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:08 crc kubenswrapper[4786]: I0313 11:49:08.439907 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:08 crc kubenswrapper[4786]: E0313 11:49:08.440608 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:08 crc kubenswrapper[4786]: E0313 11:49:08.632101 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:09 crc kubenswrapper[4786]: I0313 11:49:09.440237 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:09 crc kubenswrapper[4786]: I0313 11:49:09.440257 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:09 crc kubenswrapper[4786]: E0313 11:49:09.441357 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:09 crc kubenswrapper[4786]: I0313 11:49:09.440318 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:09 crc kubenswrapper[4786]: E0313 11:49:09.441478 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:09 crc kubenswrapper[4786]: E0313 11:49:09.441519 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:09 crc kubenswrapper[4786]: I0313 11:49:09.454001 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 11:49:10 crc kubenswrapper[4786]: I0313 11:49:10.440139 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:10 crc kubenswrapper[4786]: E0313 11:49:10.440332 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:11 crc kubenswrapper[4786]: I0313 11:49:11.440552 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:11 crc kubenswrapper[4786]: I0313 11:49:11.440624 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:11 crc kubenswrapper[4786]: I0313 11:49:11.440652 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:11 crc kubenswrapper[4786]: E0313 11:49:11.440757 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:11 crc kubenswrapper[4786]: E0313 11:49:11.440871 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:11 crc kubenswrapper[4786]: E0313 11:49:11.441018 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.431665 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.431722 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.431738 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.431761 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.431779 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:12Z","lastTransitionTime":"2026-03-13T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.440059 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:12 crc kubenswrapper[4786]: E0313 11:49:12.440428 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:12 crc kubenswrapper[4786]: E0313 11:49:12.453253 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:12Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.459436 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.459477 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.459493 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.459518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.459540 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:12Z","lastTransitionTime":"2026-03-13T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:12 crc kubenswrapper[4786]: E0313 11:49:12.479914 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:12Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.485756 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.485846 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.485865 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.485920 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.485938 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:12Z","lastTransitionTime":"2026-03-13T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:12 crc kubenswrapper[4786]: E0313 11:49:12.510856 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:12Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.516575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.516641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.516660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.516685 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.516703 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:12Z","lastTransitionTime":"2026-03-13T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:12 crc kubenswrapper[4786]: E0313 11:49:12.538520 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:12Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.543988 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.544082 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.544100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.544126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:12 crc kubenswrapper[4786]: I0313 11:49:12.544143 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:12Z","lastTransitionTime":"2026-03-13T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:12 crc kubenswrapper[4786]: E0313 11:49:12.564599 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:12Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:12 crc kubenswrapper[4786]: E0313 11:49:12.564971 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.439626 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.439676 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:13 crc kubenswrapper[4786]: E0313 11:49:13.440606 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.440644 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:13 crc kubenswrapper[4786]: E0313 11:49:13.441002 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:13 crc kubenswrapper[4786]: E0313 11:49:13.440767 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.456253 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.477920 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.499620 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.512104 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.529609 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.549999 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.569493 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.591019 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.606683 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.624724 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: E0313 11:49:13.633878 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.656496 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.672932 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.689978 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.707612 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc 
kubenswrapper[4786]: I0313 11:49:13.739101 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.759131 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.778016 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:13 crc kubenswrapper[4786]: I0313 11:49:13.804945 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:13Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:14 crc kubenswrapper[4786]: I0313 11:49:14.439762 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:14 crc kubenswrapper[4786]: E0313 11:49:14.439999 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:15 crc kubenswrapper[4786]: I0313 11:49:15.440308 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:15 crc kubenswrapper[4786]: I0313 11:49:15.440360 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:15 crc kubenswrapper[4786]: I0313 11:49:15.440495 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:15 crc kubenswrapper[4786]: E0313 11:49:15.441365 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:15 crc kubenswrapper[4786]: E0313 11:49:15.441227 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:15 crc kubenswrapper[4786]: E0313 11:49:15.440733 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:15 crc kubenswrapper[4786]: I0313 11:49:15.455712 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 11:49:16 crc kubenswrapper[4786]: I0313 11:49:16.440527 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:16 crc kubenswrapper[4786]: E0313 11:49:16.440746 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:17 crc kubenswrapper[4786]: I0313 11:49:17.440071 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:17 crc kubenswrapper[4786]: E0313 11:49:17.440254 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:17 crc kubenswrapper[4786]: I0313 11:49:17.440485 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:17 crc kubenswrapper[4786]: I0313 11:49:17.440104 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:17 crc kubenswrapper[4786]: E0313 11:49:17.440734 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:17 crc kubenswrapper[4786]: E0313 11:49:17.440798 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:18 crc kubenswrapper[4786]: I0313 11:49:18.440073 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:18 crc kubenswrapper[4786]: E0313 11:49:18.440273 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:18 crc kubenswrapper[4786]: E0313 11:49:18.634771 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:19 crc kubenswrapper[4786]: I0313 11:49:19.439872 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:19 crc kubenswrapper[4786]: E0313 11:49:19.440183 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:19 crc kubenswrapper[4786]: I0313 11:49:19.440260 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:19 crc kubenswrapper[4786]: I0313 11:49:19.440725 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:19 crc kubenswrapper[4786]: E0313 11:49:19.440868 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:19 crc kubenswrapper[4786]: E0313 11:49:19.440943 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:19 crc kubenswrapper[4786]: I0313 11:49:19.442285 4786 scope.go:117] "RemoveContainer" containerID="cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884" Mar 13 11:49:19 crc kubenswrapper[4786]: E0313 11:49:19.442833 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 11:49:20 crc kubenswrapper[4786]: I0313 11:49:20.440404 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:20 crc kubenswrapper[4786]: E0313 11:49:20.440620 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:21 crc kubenswrapper[4786]: I0313 11:49:21.440547 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:21 crc kubenswrapper[4786]: I0313 11:49:21.440595 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:21 crc kubenswrapper[4786]: E0313 11:49:21.440713 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:21 crc kubenswrapper[4786]: I0313 11:49:21.440730 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:21 crc kubenswrapper[4786]: E0313 11:49:21.440816 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:21 crc kubenswrapper[4786]: E0313 11:49:21.440970 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.439641 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.439866 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.714947 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.715013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.715030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.715053 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.715074 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:22Z","lastTransitionTime":"2026-03-13T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.735971 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:22Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.740686 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.740740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.740759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.740782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.740801 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:22Z","lastTransitionTime":"2026-03-13T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.762145 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:22Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.766343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.766433 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.766456 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.766485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.766511 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:22Z","lastTransitionTime":"2026-03-13T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.784601 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:22Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.788772 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.788814 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.788826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.788843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.788854 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:22Z","lastTransitionTime":"2026-03-13T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.803826 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:22Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.808163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.808204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.808218 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.808234 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.808246 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:22Z","lastTransitionTime":"2026-03-13T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.821874 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:22Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.822057 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:22 crc kubenswrapper[4786]: I0313 11:49:22.946358 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.946735 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:22 crc kubenswrapper[4786]: E0313 11:49:22.946815 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs podName:c19009bf-0d5a-458f-8c3e-97bc203741b1 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:54.946795303 +0000 UTC m=+182.226448770 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs") pod "network-metrics-daemon-g4pzt" (UID: "c19009bf-0d5a-458f-8c3e-97bc203741b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.350719 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.350850 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.350912 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.350965 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.351177 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 11:50:27.351133476 +0000 UTC m=+214.630786953 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.351342 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:27.35130845 +0000 UTC m=+214.630961937 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.440089 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.440245 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.440315 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.443083 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.443220 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.443352 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.452267 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.452376 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.452505 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:27.45246647 +0000 UTC m=+214.732119957 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.452514 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.452576 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.452599 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.452614 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.452658 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-03-13 11:50:27.452645105 +0000 UTC m=+214.732298582 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.452828 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.452863 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.452934 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.453033 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:27.452999104 +0000 UTC m=+214.732652601 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.462691 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.476947 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.491425 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.507764 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.530121 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.548797 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.562579 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.589195 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.621572 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: E0313 11:49:23.637170 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.639094 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.659823 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5
769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.677199 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc 
kubenswrapper[4786]: I0313 11:49:23.693355 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50c2fdac-f315-454a-b2f4-84a9d4c2f938\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:22Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 11:46:55.697054 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 11:46:55.700610 1 observer_polling.go:159] Starting file observer\\\\nI0313 11:46:55.736241 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 11:46:55.740897 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 11:47:22.280758 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 11:47:22.281040 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:47:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.714990 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.726405 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.740467 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.762973 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.780750 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:23 crc kubenswrapper[4786]: I0313 11:49:23.796503 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:24 crc kubenswrapper[4786]: I0313 11:49:24.440210 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:24 crc kubenswrapper[4786]: E0313 11:49:24.440416 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:25 crc kubenswrapper[4786]: I0313 11:49:25.440293 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:25 crc kubenswrapper[4786]: I0313 11:49:25.441039 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:25 crc kubenswrapper[4786]: I0313 11:49:25.441328 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:25 crc kubenswrapper[4786]: E0313 11:49:25.441430 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:25 crc kubenswrapper[4786]: E0313 11:49:25.441548 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:25 crc kubenswrapper[4786]: E0313 11:49:25.441610 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.218453 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/0.log" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.218559 4786 generic.go:334] "Generic (PLEG): container finished" podID="cd2e61d0-5deb-4005-85b4-c6f5ae70fe62" containerID="40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1" exitCode=1 Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.218618 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b5xwr" event={"ID":"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62","Type":"ContainerDied","Data":"40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1"} Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.219612 4786 scope.go:117] "RemoveContainer" 
containerID="40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.232770 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.250597 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fde
bf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:3
7Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.274317 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.285041 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.296120 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.305042 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc 
kubenswrapper[4786]: I0313 11:49:26.315824 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.338130 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.352812 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.364678 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.376288 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"2026-03-13T11:48:40+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba\\\\n2026-03-13T11:48:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba to /host/opt/cni/bin/\\\\n2026-03-13T11:48:40Z [verbose] multus-daemon started\\\\n2026-03-13T11:48:40Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:49:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.389316 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50c2fdac-f315-454a-b2f4-84a9d4c2f938\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:22Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 11:46:55.697054 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 11:46:55.700610 1 observer_polling.go:159] Starting file observer\\\\nI0313 11:46:55.736241 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 11:46:55.740897 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 11:47:22.280758 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 11:47:22.281040 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:47:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.401467 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.413552 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.422361 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.437933 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.439659 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:26 crc kubenswrapper[4786]: E0313 11:49:26.439820 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.453254 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.474747 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:26 crc kubenswrapper[4786]: I0313 11:49:26.487107 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:26Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.225047 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/0.log" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.225127 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b5xwr" 
event={"ID":"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62","Type":"ContainerStarted","Data":"e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec"} Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.243688 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.263377 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.282619 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.305014 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.328440 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.344658 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.365010 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.395787 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.412597 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.430369 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.440063 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.440133 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:27 crc kubenswrapper[4786]: E0313 11:49:27.440211 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.440328 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:27 crc kubenswrapper[4786]: E0313 11:49:27.440515 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:27 crc kubenswrapper[4786]: E0313 11:49:27.440692 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.448922 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc 
kubenswrapper[4786]: I0313 11:49:27.468646 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.482460 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.497170 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.517734 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.563058 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"2026-03-13T11:48:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba\\\\n2026-03-13T11:48:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba to /host/opt/cni/bin/\\\\n2026-03-13T11:48:40Z [verbose] multus-daemon started\\\\n2026-03-13T11:48:40Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:49:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.588103 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50c2fdac-f315-454a-b2f4-84a9d4c2f938\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:22Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 11:46:55.697054 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 11:46:55.700610 1 observer_polling.go:159] Starting file observer\\\\nI0313 11:46:55.736241 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 11:46:55.740897 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 11:47:22.280758 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 11:47:22.281040 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:47:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.615536 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:27 crc kubenswrapper[4786]: I0313 11:49:27.634307 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:28 crc kubenswrapper[4786]: I0313 11:49:28.439549 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:28 crc kubenswrapper[4786]: E0313 11:49:28.440115 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:28 crc kubenswrapper[4786]: E0313 11:49:28.638868 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:29 crc kubenswrapper[4786]: I0313 11:49:29.440187 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:29 crc kubenswrapper[4786]: I0313 11:49:29.440237 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:29 crc kubenswrapper[4786]: I0313 11:49:29.440212 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:29 crc kubenswrapper[4786]: E0313 11:49:29.440344 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:29 crc kubenswrapper[4786]: E0313 11:49:29.440503 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:29 crc kubenswrapper[4786]: E0313 11:49:29.440667 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:30 crc kubenswrapper[4786]: I0313 11:49:30.440417 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:30 crc kubenswrapper[4786]: E0313 11:49:30.440656 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:31 crc kubenswrapper[4786]: I0313 11:49:31.440549 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:31 crc kubenswrapper[4786]: I0313 11:49:31.440624 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:31 crc kubenswrapper[4786]: E0313 11:49:31.440808 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:31 crc kubenswrapper[4786]: I0313 11:49:31.440867 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:31 crc kubenswrapper[4786]: E0313 11:49:31.441083 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:31 crc kubenswrapper[4786]: E0313 11:49:31.441273 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:32 crc kubenswrapper[4786]: I0313 11:49:32.440238 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:32 crc kubenswrapper[4786]: E0313 11:49:32.441204 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:32 crc kubenswrapper[4786]: I0313 11:49:32.441833 4786 scope.go:117] "RemoveContainer" containerID="cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.048437 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.048490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.048507 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.048530 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.048548 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:33Z","lastTransitionTime":"2026-03-13T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.066802 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.070742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.070798 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.070811 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.070831 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.070844 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:33Z","lastTransitionTime":"2026-03-13T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.083547 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.086542 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.086578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.086592 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.086611 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.086626 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:33Z","lastTransitionTime":"2026-03-13T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.097198 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.101222 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.101274 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.101291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.101311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.101325 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:33Z","lastTransitionTime":"2026-03-13T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.115340 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.118587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.118618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.118629 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.118644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.118656 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:33Z","lastTransitionTime":"2026-03-13T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.132230 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.132445 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.248289 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/2.log" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.251638 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"} Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.252362 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.269991 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.286705 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.298368 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.316353 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.331608 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.354407 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.372174 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.381163 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.390643 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.409433 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.421913 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.435414 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.439836 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.439971 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.440048 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.440100 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.440212 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:33 crc kubenswrapper[4786]: E0313 11:49:33.440276 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.451514 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc 
kubenswrapper[4786]: I0313 11:49:33.474155 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.498260 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.508524 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.518119 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.532622 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"2026-03-13T11:48:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba\\\\n2026-03-13T11:48:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba to /host/opt/cni/bin/\\\\n2026-03-13T11:48:40Z [verbose] multus-daemon started\\\\n2026-03-13T11:48:40Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:49:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.544858 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50c2fdac-f315-454a-b2f4-84a9d4c2f938\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:22Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 11:46:55.697054 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 11:46:55.700610 1 observer_polling.go:159] Starting file observer\\\\nI0313 11:46:55.736241 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 11:46:55.740897 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 11:47:22.280758 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 11:47:22.281040 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:47:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.555061 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019a
b9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.576041 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.598734 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.613337 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.625866 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc 
kubenswrapper[4786]: E0313 11:49:33.639864 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.645026 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.658392 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.676357 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.689664 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.710484 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"2026-03-13T11:48:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba\\\\n2026-03-13T11:48:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba to /host/opt/cni/bin/\\\\n2026-03-13T11:48:40Z [verbose] multus-daemon started\\\\n2026-03-13T11:48:40Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:49:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.729298 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50c2fdac-f315-454a-b2f4-84a9d4c2f938\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:22Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 11:46:55.697054 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 11:46:55.700610 1 observer_polling.go:159] Starting file observer\\\\nI0313 11:46:55.736241 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 11:46:55.740897 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 11:47:22.280758 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 11:47:22.281040 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:47:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.753836 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.768612 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.788542 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.804037 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.822029 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc731b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.841994 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.862393 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:33 crc kubenswrapper[4786]: I0313 11:49:33.874477 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:33Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.257524 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/3.log" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.258520 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/2.log" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.262560 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272" exitCode=1 Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 
11:49:34.262612 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"} Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.262662 4786 scope.go:117] "RemoveContainer" containerID="cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.264243 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272" Mar 13 11:49:34 crc kubenswrapper[4786]: E0313 11:49:34.264537 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.288680 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\"
,\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453
c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.306285 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.325172 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.345051 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.368962 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.386673 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc 
kubenswrapper[4786]: I0313 11:49:34.406785 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.421275 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.435360 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.439652 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:34 crc kubenswrapper[4786]: E0313 11:49:34.439876 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.467212 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2383f469e703c5abc52642a9143b66b9be221e7c223f21030084d7aaf10884\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:03Z\\\",\\\"message\\\":\\\"ame:}]\\\\nI0313 11:49:03.548274 7072 transact.go:42] Configuring OVN: [{Op:update 
Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:03.547685 7072 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0313 11:49:03.547772 7072 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"l\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID: UUIDName:}]\\\\nI0313 11:49:33.485385 7396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:49:33.485404 7396 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 11:49:33.485396 7396 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:33.485452 7396 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:33.485484 7396 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:33.485582 7396 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.485812 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.504469 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.527790 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50c2fdac-f315-454a-b2f4-84a9d4c2f938\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47
:22Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 11:46:55.697054 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 11:46:55.700610 1 observer_polling.go:159] Starting file observer\\\\nI0313 11:46:55.736241 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 11:46:55.740897 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 11:47:22.280758 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 11:47:22.281040 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:47:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.556798 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.578665 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.597684 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.621284 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"2026-03-13T11:48:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba\\\\n2026-03-13T11:48:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba to /host/opt/cni/bin/\\\\n2026-03-13T11:48:40Z [verbose] multus-daemon started\\\\n2026-03-13T11:48:40Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:49:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.640342 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:34 crc kubenswrapper[4786]: I0313 11:49:34.662016 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:34Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.269293 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/3.log" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.275631 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272" Mar 13 11:49:35 crc kubenswrapper[4786]: E0313 11:49:35.276005 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.299775 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\"
,\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453
c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.316734 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.334419 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.356137 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.378503 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11847
4fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.398044 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.412680 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc 
kubenswrapper[4786]: I0313 11:49:35.430936 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.439945 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.440014 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4786]: E0313 11:49:35.440113 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.440138 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:35 crc kubenswrapper[4786]: E0313 11:49:35.440282 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:35 crc kubenswrapper[4786]: E0313 11:49:35.440348 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.442014 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.454851 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.477954 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"l\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:33.485385 7396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:49:33.485404 7396 controller.go:132] Adding controller 
ef_node_controller event handlers\\\\nI0313 11:49:33.485396 7396 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:33.485452 7396 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:33.485484 7396 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:33.485582 7396 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.493203 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.508004 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50c2fdac-f315-454a-b2f4-84a9d4c2f938\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:22Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 11:46:55.697054 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 11:46:55.700610 1 observer_polling.go:159] Starting file observer\\\\nI0313 11:46:55.736241 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 11:46:55.740897 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 11:47:22.280758 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 11:47:22.281040 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T11:47:21Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:47:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.530521 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.542981 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.558604 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.576508 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"2026-03-13T11:48:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba\\\\n2026-03-13T11:48:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba to /host/opt/cni/bin/\\\\n2026-03-13T11:48:40Z [verbose] multus-daemon started\\\\n2026-03-13T11:48:40Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:49:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.589982 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:35 crc kubenswrapper[4786]: I0313 11:49:35.608693 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4786]: I0313 11:49:36.440029 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:36 crc kubenswrapper[4786]: E0313 11:49:36.440557 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:37 crc kubenswrapper[4786]: I0313 11:49:37.440395 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:37 crc kubenswrapper[4786]: I0313 11:49:37.440447 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:37 crc kubenswrapper[4786]: E0313 11:49:37.440539 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:37 crc kubenswrapper[4786]: E0313 11:49:37.440622 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:37 crc kubenswrapper[4786]: I0313 11:49:37.440700 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:37 crc kubenswrapper[4786]: E0313 11:49:37.440960 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:38 crc kubenswrapper[4786]: I0313 11:49:38.440116 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:38 crc kubenswrapper[4786]: E0313 11:49:38.440320 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:38 crc kubenswrapper[4786]: E0313 11:49:38.641212 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:39 crc kubenswrapper[4786]: I0313 11:49:39.439544 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:39 crc kubenswrapper[4786]: I0313 11:49:39.439624 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:39 crc kubenswrapper[4786]: I0313 11:49:39.439667 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:39 crc kubenswrapper[4786]: E0313 11:49:39.439800 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:39 crc kubenswrapper[4786]: E0313 11:49:39.439872 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:39 crc kubenswrapper[4786]: E0313 11:49:39.440122 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:40 crc kubenswrapper[4786]: I0313 11:49:40.440144 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:40 crc kubenswrapper[4786]: E0313 11:49:40.440303 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:41 crc kubenswrapper[4786]: I0313 11:49:41.440253 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:41 crc kubenswrapper[4786]: I0313 11:49:41.440300 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:41 crc kubenswrapper[4786]: I0313 11:49:41.440381 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:41 crc kubenswrapper[4786]: E0313 11:49:41.440461 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:41 crc kubenswrapper[4786]: E0313 11:49:41.440609 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:41 crc kubenswrapper[4786]: E0313 11:49:41.440788 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:42 crc kubenswrapper[4786]: I0313 11:49:42.440410 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:42 crc kubenswrapper[4786]: E0313 11:49:42.440621 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.272809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.272985 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.273012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.273036 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.273059 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.292649 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.298143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.298202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.298223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.298248 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.298310 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.319946 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.325274 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.325359 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.325378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.325439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.325458 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.347164 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.354567 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.354636 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.354667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.354693 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.354713 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.377423 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.382480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.382528 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.382544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.382568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.382585 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.397776 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9070ab03-ef9a-4d2e-b143-43ffad1cba05\\\",\\\"systemUUID\\\":\\\"ed5189ac-f697-4058-b82e-47ba3df6ef92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.398518 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.439730 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.439735 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.439809 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.440495 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.440218 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.440685 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.461536 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b5xwr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"2026-03-13T11:48:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba\\\\n2026-03-13T11:48:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_95c0f0c1-d91a-4f07-9d02-6be9d0f688ba to /host/opt/cni/bin/\\\\n2026-03-13T11:48:40Z [verbose] multus-daemon started\\\\n2026-03-13T11:48:40Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:49:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp5ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b5xwr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.483729 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50c2fdac-f315-454a-b2f4-84a9d4c2f938\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e6fccfbd5cd2517989a4485851a172fcfb0694748a718c3c39fc8b429caf1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c0
26b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f3eef7f947700ef9b1ddfa4f949e726c37ad8ca78d21157ddeed3b04ef2af63\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:22Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 11:46:55.697054 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 11:46:55.700610 1 observer_polling.go:159] Starting file observer\\\\nI0313 11:46:55.736241 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 11:46:55.740897 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 11:47:22.280758 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 11:47:22.281040 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:47:21Z is 
after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:47:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ba51ce6e836d7aa430c5c561cd3f77a359d4faf83d01a7fcbcc6456cac639e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42117c28729eb4288fd5ead4e17da72fa8533efe9e756afbe58d997e5b10603\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.515482 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a2c1f2d-b4f8-40c7-a697-2d1499b2bb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96c60e1452f32399f8806a45c2d5f9469a7ac7f22cbf69da352907766682636c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfa5e331b946bdbab289897c588488374a3b8758be28b05e21c25dd44231f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a75cd14645e1c613c378822351c102e08ab51f78d21cce4f212347acddb290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e0ac6d6f69cf3b988316b2aa28cac4f5b5cdde9e3f8963fe48ee35aae0a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b03c45da139b832367840a4009c324505bc3a6b57f03b57dc41fe7d53e34352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a74988d538115b26f4f7f0da40df10b5e9b7c232779e2a24027872018af4584\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69b79457e70ab5e2565f3bbd32384e166a44211fefe32dbf8b3176f82ff11806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b7f4f655a215bd324f91bebbdc3af2896fd747e2def3c3ef32ed2e0b7687e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.535399 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.556044 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.575307 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06162327-b651-4f1e-b7be-600ac07a6b6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9063d4afe0cafac6c1d9553b6622a2a0d38754a6e849c97b83131b9b9b1688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://951a4d8baab78036ec3759469ac41b76b46edcfc8a312a55c026ef4bca9dcfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada8d5a8a63ff4fb27d6d5a7c258eea2bccf0c013e78f992fb0b8d277b7e80fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c9c6589bc648eef8f92489ef4575a04e0402f4abc5691bf880c91976a03ef116\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.598286 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929fc90ca515ef8e029f9d039c2007f6c019279fa498d258da538a74beebd9e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.623573 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6g54w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3bf7ec5-4cc0-41b9-b916-f1797cbe149c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502da12a4135988784e3ca121ef65ae5662d990d1c271405acd4e61ef21de022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60e9881eca5d3adad3db0dfef60acbfcc581c39b73ae06c1c4b5d8e00980b3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a8cc77753ea469415971fde1490ba280dac503dc946c0d657ef8e2ef58078a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55735d2d4ddc99f784e8f4ac7d3b9bc3e804cd69b1c5d5647507cd825d2912e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://118474fca61576a93648173e31dcaf1111cbbbf96589d0d790874ec71213b0d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ee29c835dd12ec67ca5c920e04d5a0cb392bc62594d21a8249653a382e563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b09718625112156c997cc88e50881936b40463275f46b56934357b8a5b338d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmgpf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6g54w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: E0313 11:49:43.642976 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.644155 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7db6fd3-49ce-4311-850e-dcb4e4db3a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:47:54Z\\\"
,\\\"message\\\":\\\"g file observer\\\\nW0313 11:47:54.200970 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:47:54.201064 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:47:54.201635 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-78686140/tls.crt::/tmp/serving-cert-78686140/tls.key\\\\\\\"\\\\nI0313 11:47:54.616630 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:47:54.625000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:47:54.625038 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:47:54.625072 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:47:54.625092 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:47:54.633778 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:47:54.633808 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633815 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:47:54.633820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:47:54.633825 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:47:54.633831 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:47:54.633835 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:47:54.633865 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:47:54.636057 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:47:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfe80ec8d3c09f652de8d618fd63e453
c509e474736d42bbecbbad0f9205706\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.661174 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8b6a732-c863-49d9-9908-fbfe0e8c5825\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:46:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://205a2b4489b584b19ee5d65ede3d7daf817741f9ebcf68fa0bd21b46d8cb3b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e4366c77e3a5c5f925f51015310b032af8c21ae396c5458e124f5399b110bd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:46:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:46:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.679445 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f4bb7ae4bb73cb7971c503a1c66227f1d8e6e28038c34eacb9d118aba4c687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.699801 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26962ebd3ae34bf41ab0fa6203d0826f55c5a3648eb2b1b13eab79603ae6d834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86106fa779fbc73
1b834d98720e13b9007ed9f6ac806b020d1859f6c56ff52b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.716099 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lwrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da7e4ae2-afe2-4408-921d-d9ecb7c8c803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb2aa374816da70dca60739bb4174da09f6ef1131d924a63be9b8f836e031e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqgd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lwrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.734405 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97b994ad-2b42-41b1-9976-bfe949acbc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d3cdb48f11acfc709991f6e652add5136402a6649c7f5769409890f31f8c4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11292cbcedce35968affb95b57caf66c68249fd87a2132316b31664b405e9a42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vht7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz9x8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.750626 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c19009bf-0d5a-458f-8c3e-97bc203741b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjgjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g4pzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc 
kubenswrapper[4786]: I0313 11:49:43.770074 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.786045 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rln62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c50d3c3e-8ce8-4be6-9bf5-6c486c3a4df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6a6b1bec271a1a4554c783ba69cdc8b65aa89c110257c41073d31b5b58a90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6h48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rln62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.802669 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75da9242-3ddf-4eca-82df-a5fc998b0fdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c828590b4de7bc6de4009d212e3edae219a6f7840fdebf89f0d7cdacd47d34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9b4h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8ncs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4786]: I0313 11:49:43.834959 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:33Z\\\",\\\"message\\\":\\\"l\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:33.485385 7396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:49:33.485404 7396 controller.go:132] Adding controller 
ef_node_controller event handlers\\\\nI0313 11:49:33.485396 7396 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 11:49:33.485452 7396 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:33.485484 7396 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:33.485582 7396 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9906eb94a7bbe00bb
1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2cq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4z4th\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:44 crc kubenswrapper[4786]: I0313 11:49:44.440205 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:44 crc kubenswrapper[4786]: E0313 11:49:44.440408 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:45 crc kubenswrapper[4786]: I0313 11:49:45.440765 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:45 crc kubenswrapper[4786]: I0313 11:49:45.440800 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:45 crc kubenswrapper[4786]: I0313 11:49:45.441105 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:45 crc kubenswrapper[4786]: E0313 11:49:45.441297 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:45 crc kubenswrapper[4786]: E0313 11:49:45.441400 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:45 crc kubenswrapper[4786]: E0313 11:49:45.441623 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:46 crc kubenswrapper[4786]: I0313 11:49:46.440043 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:46 crc kubenswrapper[4786]: E0313 11:49:46.441080 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:47 crc kubenswrapper[4786]: I0313 11:49:47.439767 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:47 crc kubenswrapper[4786]: I0313 11:49:47.439839 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:47 crc kubenswrapper[4786]: E0313 11:49:47.440011 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:47 crc kubenswrapper[4786]: I0313 11:49:47.440054 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:47 crc kubenswrapper[4786]: E0313 11:49:47.440257 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:47 crc kubenswrapper[4786]: E0313 11:49:47.440359 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:48 crc kubenswrapper[4786]: I0313 11:49:48.440552 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:48 crc kubenswrapper[4786]: E0313 11:49:48.441626 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:48 crc kubenswrapper[4786]: I0313 11:49:48.441682 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272" Mar 13 11:49:48 crc kubenswrapper[4786]: E0313 11:49:48.442577 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 11:49:48 crc kubenswrapper[4786]: E0313 11:49:48.644624 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:49 crc kubenswrapper[4786]: I0313 11:49:49.440442 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:49 crc kubenswrapper[4786]: E0313 11:49:49.440616 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:49 crc kubenswrapper[4786]: I0313 11:49:49.440720 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:49 crc kubenswrapper[4786]: I0313 11:49:49.440741 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:49 crc kubenswrapper[4786]: E0313 11:49:49.440955 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:49 crc kubenswrapper[4786]: E0313 11:49:49.441099 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:50 crc kubenswrapper[4786]: I0313 11:49:50.440265 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:50 crc kubenswrapper[4786]: E0313 11:49:50.440486 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:51 crc kubenswrapper[4786]: I0313 11:49:51.440157 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:51 crc kubenswrapper[4786]: I0313 11:49:51.440247 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:51 crc kubenswrapper[4786]: E0313 11:49:51.440335 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:51 crc kubenswrapper[4786]: I0313 11:49:51.440373 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:51 crc kubenswrapper[4786]: E0313 11:49:51.440565 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:51 crc kubenswrapper[4786]: E0313 11:49:51.440649 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:52 crc kubenswrapper[4786]: I0313 11:49:52.440036 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:52 crc kubenswrapper[4786]: E0313 11:49:52.440152 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.440399 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.440442 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:53 crc kubenswrapper[4786]: E0313 11:49:53.440612 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.440668 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:53 crc kubenswrapper[4786]: E0313 11:49:53.440834 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:53 crc kubenswrapper[4786]: E0313 11:49:53.440936 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.486164 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.486138824 podStartE2EDuration="44.486138824s" podCreationTimestamp="2026-03-13 11:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.465494728 +0000 UTC m=+180.745148205" watchObservedRunningTime="2026-03-13 11:49:53.486138824 +0000 UTC m=+180.765792331" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.519943 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=85.519856501 podStartE2EDuration="1m25.519856501s" podCreationTimestamp="2026-03-13 11:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.519660896 +0000 UTC m=+180.799314363" watchObservedRunningTime="2026-03-13 11:49:53.519856501 +0000 UTC m=+180.799509988" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.549209 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=88.549184837 podStartE2EDuration="1m28.549184837s" podCreationTimestamp="2026-03-13 11:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.535125021 +0000 UTC m=+180.814778468" watchObservedRunningTime="2026-03-13 11:49:53.549184837 +0000 UTC m=+180.828838304" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.581296 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-6g54w" podStartSLOduration=120.581276539 podStartE2EDuration="2m0.581276539s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.580199989 +0000 UTC m=+180.859853446" watchObservedRunningTime="2026-03-13 11:49:53.581276539 +0000 UTC m=+180.860929986" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.592929 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz9x8" podStartSLOduration=120.592905729 podStartE2EDuration="2m0.592905729s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.592155178 +0000 UTC m=+180.871808635" watchObservedRunningTime="2026-03-13 11:49:53.592905729 +0000 UTC m=+180.872559196" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.605287 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.605330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.605345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.605364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.605378 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.637632 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rln62" podStartSLOduration=120.637611907 podStartE2EDuration="2m0.637611907s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.637504254 +0000 UTC m=+180.917157721" watchObservedRunningTime="2026-03-13 11:49:53.637611907 +0000 UTC m=+180.917265354" Mar 13 11:49:53 crc kubenswrapper[4786]: E0313 11:49:53.645848 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.652778 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podStartSLOduration=120.652752724 podStartE2EDuration="2m0.652752724s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.652025644 +0000 UTC m=+180.931679091" watchObservedRunningTime="2026-03-13 11:49:53.652752724 +0000 UTC m=+180.932406181" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.653501 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp"] Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.653948 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.655777 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.656027 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.657088 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.657684 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.705989 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lwrsl" podStartSLOduration=120.705968496 podStartE2EDuration="2m0.705968496s" 
podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.690841041 +0000 UTC m=+180.970494488" watchObservedRunningTime="2026-03-13 11:49:53.705968496 +0000 UTC m=+180.985621953" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.706137 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=38.70613218 podStartE2EDuration="38.70613218s" podCreationTimestamp="2026-03-13 11:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.704625369 +0000 UTC m=+180.984278846" watchObservedRunningTime="2026-03-13 11:49:53.70613218 +0000 UTC m=+180.985785637" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.729612 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=83.729594985 podStartE2EDuration="1m23.729594985s" podCreationTimestamp="2026-03-13 11:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.729454872 +0000 UTC m=+181.009108319" watchObservedRunningTime="2026-03-13 11:49:53.729594985 +0000 UTC m=+181.009248442" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.746648 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fc420142-79fb-4773-8339-0ecc28d65834-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.746742 
4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc420142-79fb-4773-8339-0ecc28d65834-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.746776 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fc420142-79fb-4773-8339-0ecc28d65834-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.746795 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc420142-79fb-4773-8339-0ecc28d65834-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.746834 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc420142-79fb-4773-8339-0ecc28d65834-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.771712 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b5xwr" podStartSLOduration=120.771693922 podStartE2EDuration="2m0.771693922s" podCreationTimestamp="2026-03-13 
11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:53.770957803 +0000 UTC m=+181.050611290" watchObservedRunningTime="2026-03-13 11:49:53.771693922 +0000 UTC m=+181.051347379" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.848030 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fc420142-79fb-4773-8339-0ecc28d65834-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.848132 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fc420142-79fb-4773-8339-0ecc28d65834-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.848139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc420142-79fb-4773-8339-0ecc28d65834-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.848249 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fc420142-79fb-4773-8339-0ecc28d65834-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.848267 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc420142-79fb-4773-8339-0ecc28d65834-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.848334 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc420142-79fb-4773-8339-0ecc28d65834-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.848376 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fc420142-79fb-4773-8339-0ecc28d65834-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.849169 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc420142-79fb-4773-8339-0ecc28d65834-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.855293 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc420142-79fb-4773-8339-0ecc28d65834-serving-cert\") 
pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.865691 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc420142-79fb-4773-8339-0ecc28d65834-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2fckp\" (UID: \"fc420142-79fb-4773-8339-0ecc28d65834\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: I0313 11:49:53.969406 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" Mar 13 11:49:53 crc kubenswrapper[4786]: W0313 11:49:53.995273 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc420142_79fb_4773_8339_0ecc28d65834.slice/crio-06d6944ff16bed5d038e0417c115706bb7dd9e762fc985da4f1ead93cfc4cbf8 WatchSource:0}: Error finding container 06d6944ff16bed5d038e0417c115706bb7dd9e762fc985da4f1ead93cfc4cbf8: Status 404 returned error can't find the container with id 06d6944ff16bed5d038e0417c115706bb7dd9e762fc985da4f1ead93cfc4cbf8 Mar 13 11:49:54 crc kubenswrapper[4786]: I0313 11:49:54.348800 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" event={"ID":"fc420142-79fb-4773-8339-0ecc28d65834","Type":"ContainerStarted","Data":"18153cfcf0167726b887acc2508daff0caa41bd06c4f9c4584a883b6d169890a"} Mar 13 11:49:54 crc kubenswrapper[4786]: I0313 11:49:54.348903 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" 
event={"ID":"fc420142-79fb-4773-8339-0ecc28d65834","Type":"ContainerStarted","Data":"06d6944ff16bed5d038e0417c115706bb7dd9e762fc985da4f1ead93cfc4cbf8"} Mar 13 11:49:54 crc kubenswrapper[4786]: I0313 11:49:54.369453 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fckp" podStartSLOduration=121.36943036 podStartE2EDuration="2m1.36943036s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:49:54.368619479 +0000 UTC m=+181.648272996" watchObservedRunningTime="2026-03-13 11:49:54.36943036 +0000 UTC m=+181.649083817" Mar 13 11:49:54 crc kubenswrapper[4786]: I0313 11:49:54.440671 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:54 crc kubenswrapper[4786]: E0313 11:49:54.440963 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:54 crc kubenswrapper[4786]: I0313 11:49:54.563003 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 13 11:49:54 crc kubenswrapper[4786]: I0313 11:49:54.573465 4786 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 11:49:54 crc kubenswrapper[4786]: I0313 11:49:54.961407 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:54 crc kubenswrapper[4786]: E0313 11:49:54.961617 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:54 crc kubenswrapper[4786]: E0313 11:49:54.961735 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs podName:c19009bf-0d5a-458f-8c3e-97bc203741b1 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:58.961707978 +0000 UTC m=+246.241361465 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs") pod "network-metrics-daemon-g4pzt" (UID: "c19009bf-0d5a-458f-8c3e-97bc203741b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:55 crc kubenswrapper[4786]: I0313 11:49:55.439595 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:55 crc kubenswrapper[4786]: E0313 11:49:55.439792 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:55 crc kubenswrapper[4786]: I0313 11:49:55.439847 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:55 crc kubenswrapper[4786]: E0313 11:49:55.440047 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:55 crc kubenswrapper[4786]: I0313 11:49:55.439821 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:55 crc kubenswrapper[4786]: E0313 11:49:55.440581 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:56 crc kubenswrapper[4786]: I0313 11:49:56.440517 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:56 crc kubenswrapper[4786]: E0313 11:49:56.440976 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:57 crc kubenswrapper[4786]: I0313 11:49:57.440193 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:57 crc kubenswrapper[4786]: I0313 11:49:57.440277 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:57 crc kubenswrapper[4786]: I0313 11:49:57.440313 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:57 crc kubenswrapper[4786]: E0313 11:49:57.440432 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:57 crc kubenswrapper[4786]: E0313 11:49:57.440532 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:57 crc kubenswrapper[4786]: E0313 11:49:57.440686 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:58 crc kubenswrapper[4786]: I0313 11:49:58.439504 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:49:58 crc kubenswrapper[4786]: E0313 11:49:58.439703 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:49:58 crc kubenswrapper[4786]: E0313 11:49:58.647305 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:49:59 crc kubenswrapper[4786]: I0313 11:49:59.440531 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:59 crc kubenswrapper[4786]: I0313 11:49:59.440612 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:59 crc kubenswrapper[4786]: I0313 11:49:59.440554 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:59 crc kubenswrapper[4786]: E0313 11:49:59.440759 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:59 crc kubenswrapper[4786]: E0313 11:49:59.440862 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:59 crc kubenswrapper[4786]: E0313 11:49:59.441026 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:00 crc kubenswrapper[4786]: I0313 11:50:00.439975 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:00 crc kubenswrapper[4786]: E0313 11:50:00.440210 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:01 crc kubenswrapper[4786]: I0313 11:50:01.439822 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:01 crc kubenswrapper[4786]: I0313 11:50:01.439933 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:01 crc kubenswrapper[4786]: E0313 11:50:01.439991 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:01 crc kubenswrapper[4786]: I0313 11:50:01.440035 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:01 crc kubenswrapper[4786]: E0313 11:50:01.440133 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:01 crc kubenswrapper[4786]: E0313 11:50:01.440222 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:02 crc kubenswrapper[4786]: I0313 11:50:02.439787 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:02 crc kubenswrapper[4786]: E0313 11:50:02.440450 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:03 crc kubenswrapper[4786]: I0313 11:50:03.439818 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:03 crc kubenswrapper[4786]: I0313 11:50:03.439968 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:03 crc kubenswrapper[4786]: I0313 11:50:03.439972 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:03 crc kubenswrapper[4786]: E0313 11:50:03.441791 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:03 crc kubenswrapper[4786]: E0313 11:50:03.442011 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:03 crc kubenswrapper[4786]: E0313 11:50:03.442845 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:03 crc kubenswrapper[4786]: I0313 11:50:03.443670 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272" Mar 13 11:50:03 crc kubenswrapper[4786]: E0313 11:50:03.444066 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4z4th_openshift-ovn-kubernetes(4fb3555e-af42-44e2-89e8-6f0a8d5d485c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" Mar 13 11:50:03 crc kubenswrapper[4786]: E0313 11:50:03.648696 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:04 crc kubenswrapper[4786]: I0313 11:50:04.439845 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:04 crc kubenswrapper[4786]: E0313 11:50:04.440060 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:05 crc kubenswrapper[4786]: I0313 11:50:05.439805 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:05 crc kubenswrapper[4786]: I0313 11:50:05.439913 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:05 crc kubenswrapper[4786]: E0313 11:50:05.440051 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:05 crc kubenswrapper[4786]: I0313 11:50:05.440146 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:05 crc kubenswrapper[4786]: E0313 11:50:05.440321 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:05 crc kubenswrapper[4786]: E0313 11:50:05.440619 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:06 crc kubenswrapper[4786]: I0313 11:50:06.439478 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:06 crc kubenswrapper[4786]: E0313 11:50:06.439692 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:07 crc kubenswrapper[4786]: I0313 11:50:07.440055 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:07 crc kubenswrapper[4786]: I0313 11:50:07.440095 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:07 crc kubenswrapper[4786]: E0313 11:50:07.440183 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:07 crc kubenswrapper[4786]: I0313 11:50:07.440262 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:07 crc kubenswrapper[4786]: E0313 11:50:07.440376 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:07 crc kubenswrapper[4786]: E0313 11:50:07.440497 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:08 crc kubenswrapper[4786]: I0313 11:50:08.439496 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:08 crc kubenswrapper[4786]: E0313 11:50:08.439671 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:08 crc kubenswrapper[4786]: E0313 11:50:08.649958 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:09 crc kubenswrapper[4786]: I0313 11:50:09.440347 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:09 crc kubenswrapper[4786]: I0313 11:50:09.440417 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:09 crc kubenswrapper[4786]: E0313 11:50:09.440588 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:09 crc kubenswrapper[4786]: I0313 11:50:09.440683 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:09 crc kubenswrapper[4786]: E0313 11:50:09.440796 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:09 crc kubenswrapper[4786]: E0313 11:50:09.440931 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:10 crc kubenswrapper[4786]: I0313 11:50:10.440486 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:10 crc kubenswrapper[4786]: E0313 11:50:10.441301 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:11 crc kubenswrapper[4786]: I0313 11:50:11.440323 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:11 crc kubenswrapper[4786]: I0313 11:50:11.440353 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:11 crc kubenswrapper[4786]: E0313 11:50:11.440563 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:11 crc kubenswrapper[4786]: E0313 11:50:11.440667 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:11 crc kubenswrapper[4786]: I0313 11:50:11.441234 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:11 crc kubenswrapper[4786]: E0313 11:50:11.441603 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:12 crc kubenswrapper[4786]: I0313 11:50:12.420624 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/1.log" Mar 13 11:50:12 crc kubenswrapper[4786]: I0313 11:50:12.421503 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/0.log" Mar 13 11:50:12 crc kubenswrapper[4786]: I0313 11:50:12.421579 4786 generic.go:334] "Generic (PLEG): container finished" podID="cd2e61d0-5deb-4005-85b4-c6f5ae70fe62" containerID="e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec" exitCode=1 Mar 13 11:50:12 crc kubenswrapper[4786]: I0313 11:50:12.421636 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b5xwr" event={"ID":"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62","Type":"ContainerDied","Data":"e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec"} Mar 13 11:50:12 crc kubenswrapper[4786]: I0313 11:50:12.421724 4786 scope.go:117] "RemoveContainer" containerID="40d0cb775e8e61e92ffc31af22c570bc82d386e36c9a78b43787fc0aa0269ca1" Mar 13 11:50:12 crc kubenswrapper[4786]: I0313 11:50:12.422611 4786 scope.go:117] "RemoveContainer" containerID="e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec" Mar 13 11:50:12 crc kubenswrapper[4786]: E0313 11:50:12.423067 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-b5xwr_openshift-multus(cd2e61d0-5deb-4005-85b4-c6f5ae70fe62)\"" pod="openshift-multus/multus-b5xwr" podUID="cd2e61d0-5deb-4005-85b4-c6f5ae70fe62" Mar 13 11:50:12 crc kubenswrapper[4786]: I0313 11:50:12.439765 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:12 crc kubenswrapper[4786]: E0313 11:50:12.440074 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:13 crc kubenswrapper[4786]: I0313 11:50:13.426515 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/1.log" Mar 13 11:50:13 crc kubenswrapper[4786]: I0313 11:50:13.439835 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:13 crc kubenswrapper[4786]: I0313 11:50:13.440144 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:13 crc kubenswrapper[4786]: E0313 11:50:13.441048 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:13 crc kubenswrapper[4786]: I0313 11:50:13.441106 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:13 crc kubenswrapper[4786]: E0313 11:50:13.441234 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:13 crc kubenswrapper[4786]: E0313 11:50:13.441570 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:13 crc kubenswrapper[4786]: E0313 11:50:13.651222 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:14 crc kubenswrapper[4786]: I0313 11:50:14.440524 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:14 crc kubenswrapper[4786]: E0313 11:50:14.441759 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:15 crc kubenswrapper[4786]: I0313 11:50:15.440007 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:15 crc kubenswrapper[4786]: I0313 11:50:15.440116 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:15 crc kubenswrapper[4786]: I0313 11:50:15.440206 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:15 crc kubenswrapper[4786]: E0313 11:50:15.441546 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:15 crc kubenswrapper[4786]: E0313 11:50:15.441534 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:15 crc kubenswrapper[4786]: E0313 11:50:15.441851 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:16 crc kubenswrapper[4786]: I0313 11:50:16.439774 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:16 crc kubenswrapper[4786]: E0313 11:50:16.440028 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:16 crc kubenswrapper[4786]: I0313 11:50:16.441003 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272" Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.377660 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g4pzt"] Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.439518 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:17 crc kubenswrapper[4786]: E0313 11:50:17.439740 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.439772 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:17 crc kubenswrapper[4786]: E0313 11:50:17.440063 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.439596 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:17 crc kubenswrapper[4786]: E0313 11:50:17.440306 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.445929 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/3.log" Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.449945 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerStarted","Data":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"} Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.449988 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:17 crc kubenswrapper[4786]: E0313 11:50:17.450131 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.451317 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:50:17 crc kubenswrapper[4786]: I0313 11:50:17.511809 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podStartSLOduration=144.511774292 podStartE2EDuration="2m24.511774292s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:17.510528898 +0000 UTC m=+204.790182405" watchObservedRunningTime="2026-03-13 11:50:17.511774292 +0000 UTC m=+204.791427769" Mar 13 11:50:18 crc kubenswrapper[4786]: E0313 11:50:18.653693 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:19 crc kubenswrapper[4786]: I0313 11:50:19.440722 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:19 crc kubenswrapper[4786]: I0313 11:50:19.440858 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:19 crc kubenswrapper[4786]: I0313 11:50:19.440722 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:19 crc kubenswrapper[4786]: E0313 11:50:19.441042 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:19 crc kubenswrapper[4786]: I0313 11:50:19.441110 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:19 crc kubenswrapper[4786]: E0313 11:50:19.441264 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:19 crc kubenswrapper[4786]: E0313 11:50:19.441408 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:19 crc kubenswrapper[4786]: E0313 11:50:19.441530 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:21 crc kubenswrapper[4786]: I0313 11:50:21.440018 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:21 crc kubenswrapper[4786]: I0313 11:50:21.440052 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:21 crc kubenswrapper[4786]: I0313 11:50:21.440104 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:21 crc kubenswrapper[4786]: E0313 11:50:21.440216 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:21 crc kubenswrapper[4786]: I0313 11:50:21.440292 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:21 crc kubenswrapper[4786]: E0313 11:50:21.440485 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:21 crc kubenswrapper[4786]: E0313 11:50:21.440604 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:21 crc kubenswrapper[4786]: E0313 11:50:21.440754 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:23 crc kubenswrapper[4786]: I0313 11:50:23.440382 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:23 crc kubenswrapper[4786]: I0313 11:50:23.440415 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:23 crc kubenswrapper[4786]: E0313 11:50:23.442447 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:23 crc kubenswrapper[4786]: I0313 11:50:23.442639 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:23 crc kubenswrapper[4786]: I0313 11:50:23.442698 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:23 crc kubenswrapper[4786]: E0313 11:50:23.442996 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:23 crc kubenswrapper[4786]: E0313 11:50:23.443056 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:23 crc kubenswrapper[4786]: E0313 11:50:23.443160 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:23 crc kubenswrapper[4786]: E0313 11:50:23.655362 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:24 crc kubenswrapper[4786]: I0313 11:50:24.440994 4786 scope.go:117] "RemoveContainer" containerID="e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec" Mar 13 11:50:25 crc kubenswrapper[4786]: I0313 11:50:25.439545 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:25 crc kubenswrapper[4786]: I0313 11:50:25.439544 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:25 crc kubenswrapper[4786]: I0313 11:50:25.439595 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:25 crc kubenswrapper[4786]: I0313 11:50:25.439667 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:25 crc kubenswrapper[4786]: E0313 11:50:25.440202 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:25 crc kubenswrapper[4786]: E0313 11:50:25.440294 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:25 crc kubenswrapper[4786]: E0313 11:50:25.440397 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:25 crc kubenswrapper[4786]: E0313 11:50:25.440473 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:25 crc kubenswrapper[4786]: I0313 11:50:25.481053 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/1.log" Mar 13 11:50:25 crc kubenswrapper[4786]: I0313 11:50:25.481130 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b5xwr" event={"ID":"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62","Type":"ContainerStarted","Data":"6827a12fecc9b0287ae0b64a23d85b0319b84398bbfce6c8aa49249074ac5ff4"} Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.436996 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.437079 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.437162 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.437241 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not 
registered Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.437268 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:52:29.437244237 +0000 UTC m=+336.716897764 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.437295 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:52:29.437278367 +0000 UTC m=+336.716931824 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.440094 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.440128 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.440255 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.440323 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.440398 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.440323 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.440514 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g4pzt" podUID="c19009bf-0d5a-458f-8c3e-97bc203741b1" Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.440661 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.538254 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.538389 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:27 crc kubenswrapper[4786]: I0313 11:50:27.538426 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538553 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:52:29.53851417 +0000 UTC m=+336.818167667 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538562 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538626 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538670 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538697 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538634 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538806 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538778 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:52:29.538752566 +0000 UTC m=+336.818406053 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:27 crc kubenswrapper[4786]: E0313 11:50:27.538920 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:52:29.53890045 +0000 UTC m=+336.818553907 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.439627 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.439706 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.439807 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.440520 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.443500 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.443638 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.443722 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.444023 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.444026 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 11:50:29 crc kubenswrapper[4786]: I0313 11:50:29.444928 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.307079 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.356830 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8q2gx"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.357346 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.361385 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-76tgr"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.361832 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.364840 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.366138 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.403085 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:50:34 crc kubenswrapper[4786]: W0313 11:50:34.403296 4786 reflector.go:561] object-"openshift-authentication-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 13 11:50:34 crc kubenswrapper[4786]: E0313 11:50:34.403366 4786 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and 
this object" logger="UnhandledError" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.403190 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.404634 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.404864 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.405306 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.405861 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.406060 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7tbp9"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.406674 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.407464 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.407834 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.408114 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7slx"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.408254 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.408710 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.410251 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.410387 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.424767 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.428191 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.440197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t96f\" (UniqueName: 
\"kubernetes.io/projected/8a3e1982-e5ea-40c7-b606-7b4464d32971-kube-api-access-2t96f\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.440275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.440324 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e1982-e5ea-40c7-b606-7b4464d32971-serving-cert\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.440355 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-client-ca\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.440512 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-config\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc 
kubenswrapper[4786]: I0313 11:50:34.446979 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.447030 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.478695 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.488477 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.488615 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.488621 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.489833 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.490365 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.492279 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.492644 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.493252 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.493278 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.493469 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.493594 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.493675 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.493715 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.493902 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.494458 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.494584 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.494470 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 11:50:34 
crc kubenswrapper[4786]: I0313 11:50:34.494799 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.494819 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.494939 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.495022 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.495052 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.495511 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.495665 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.495763 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.498784 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.499397 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.544739 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.553000 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.555080 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.555248 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.555399 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.555487 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.555829 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.558394 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-6rd6k"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.560560 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-6rd6k" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.561793 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.562276 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.562477 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.562500 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.562606 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.562716 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.562991 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.563421 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564420 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qk2r\" (UniqueName: \"kubernetes.io/projected/01831a9e-e080-4ede-905a-34277de02b46-kube-api-access-4qk2r\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: 
\"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564450 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wvn\" (UniqueName: \"kubernetes.io/projected/145932f5-ae28-4331-8bb2-a7fa535d7f96-kube-api-access-44wvn\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564467 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564487 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e090577d-dd68-4f18-b70a-836560c655ce-audit-dir\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564504 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/01831a9e-e080-4ede-905a-34277de02b46-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564520 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564535 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6wx2\" (UniqueName: \"kubernetes.io/projected/1c3df92f-12d8-45f6-9369-c8b12f933e3c-kube-api-access-f6wx2\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564554 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564574 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564590 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1c3df92f-12d8-45f6-9369-c8b12f933e3c-serving-cert\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564605 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-etcd-client\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564620 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-serving-cert\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564644 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e1982-e5ea-40c7-b606-7b4464d32971-serving-cert\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564662 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc 
kubenswrapper[4786]: I0313 11:50:34.564678 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-service-ca-bundle\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564696 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/01831a9e-e080-4ede-905a-34277de02b46-images\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564711 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-audit-policies\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564727 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564755 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564773 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564792 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t96f\" (UniqueName: \"kubernetes.io/projected/8a3e1982-e5ea-40c7-b606-7b4464d32971-kube-api-access-2t96f\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564810 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564826 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-config\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc 
kubenswrapper[4786]: I0313 11:50:34.564859 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564932 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564954 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/145932f5-ae28-4331-8bb2-a7fa535d7f96-audit-dir\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564969 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.564994 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpkz4\" (UniqueName: 
\"kubernetes.io/projected/1f2ac51c-103a-4e6c-806b-498c330fa36a-kube-api-access-lpkz4\") pod \"cluster-samples-operator-665b6dd947-z5l8m\" (UID: \"1f2ac51c-103a-4e6c-806b-498c330fa36a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.563434 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.565564 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.566181 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567528 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-encryption-config\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567568 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01831a9e-e080-4ede-905a-34277de02b46-config\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567587 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567604 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-audit-policies\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567620 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567639 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-client-ca\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567655 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567674 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1f2ac51c-103a-4e6c-806b-498c330fa36a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z5l8m\" (UID: \"1f2ac51c-103a-4e6c-806b-498c330fa36a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567692 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-config\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.567710 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4967\" (UniqueName: \"kubernetes.io/projected/e090577d-dd68-4f18-b70a-836560c655ce-kube-api-access-n4967\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.568408 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-client-ca\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.569317 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-config\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.570921 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8g6q"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.571320 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.571345 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.573297 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e1982-e5ea-40c7-b606-7b4464d32971-serving-cert\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.575166 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.577792 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.577977 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.578448 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 
11:50:34.578696 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.579338 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.581547 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pckgr"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.582054 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.582337 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.582413 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.582582 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.582875 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.582896 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.585952 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.586377 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cgkm9"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.586826 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.586867 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.587213 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.587724 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.587834 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4rkw"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588313 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588453 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588467 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588638 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588681 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588714 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588642 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588684 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588815 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.588894 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.590925 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r"] Mar 13 11:50:34 crc 
kubenswrapper[4786]: I0313 11:50:34.591436 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.592146 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nzss5"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.592440 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.593209 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hk9g4"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.596810 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.598100 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.598565 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.599841 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.605931 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.606430 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.606483 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 
11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.606843 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.607236 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.612257 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.612667 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.619127 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.619361 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.619466 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.620907 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.621072 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.621813 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.622217 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.622292 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.622331 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.622530 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.622767 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.622803 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.622985 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.623001 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.623174 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.625559 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.626003 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.626027 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.626137 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.626195 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.626267 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.626327 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.626397 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.626411 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.628336 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.628672 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.630332 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.630441 4786 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.630664 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.631618 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.632663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t96f\" (UniqueName: \"kubernetes.io/projected/8a3e1982-e5ea-40c7-b606-7b4464d32971-kube-api-access-2t96f\") pod \"controller-manager-879f6c89f-8q2gx\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.634990 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.635623 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.640141 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.640135 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.640918 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.643910 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d5hqz"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.644624 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.657243 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j6thk"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.657897 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.657972 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vcflh"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.658552 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.658803 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.659232 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.659777 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.660641 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.662229 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.663612 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-76tgr"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.664393 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8q2gx"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.665296 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.665924 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.666090 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.666217 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-62rhz"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.666756 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.670096 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.670791 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.670938 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19edfa3c-b549-4376-b53d-d0f8a448bdec-serving-cert\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.671098 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.671600 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-serving-cert\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.671796 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/145932f5-ae28-4331-8bb2-a7fa535d7f96-audit-dir\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.672288 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/145932f5-ae28-4331-8bb2-a7fa535d7f96-audit-dir\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.672453 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vw8v9"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.672200 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.672489 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.673210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpkz4\" (UniqueName: \"kubernetes.io/projected/1f2ac51c-103a-4e6c-806b-498c330fa36a-kube-api-access-lpkz4\") pod \"cluster-samples-operator-665b6dd947-z5l8m\" (UID: \"1f2ac51c-103a-4e6c-806b-498c330fa36a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.673209 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.674534 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b96faa7c-3975-4af4-8456-42ed7bbf9897-srv-cert\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.674592 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-encryption-config\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.674623 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.674653 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gmqh\" (UniqueName: \"kubernetes.io/projected/881358ff-be80-45e8-878e-445fcd8f8bda-kube-api-access-4gmqh\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.674832 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01831a9e-e080-4ede-905a-34277de02b46-config\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.674866 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-oauth-config\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.674896 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d7e088-128a-4842-832e-c78fdfe99913-proxy-tls\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3d7e088-128a-4842-832e-c78fdfe99913-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675095 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-audit-policies\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675154 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdtjw\" (UniqueName: \"kubernetes.io/projected/d94210f8-5f1d-4fa5-8954-14d18f8fa0e4-kube-api-access-hdtjw\") pod \"control-plane-machine-set-operator-78cbb6b69f-5h7bj\" (UID: \"d94210f8-5f1d-4fa5-8954-14d18f8fa0e4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675208 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-service-ca\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675228 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675253 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1f2ac51c-103a-4e6c-806b-498c330fa36a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z5l8m\" (UID: \"1f2ac51c-103a-4e6c-806b-498c330fa36a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhb5n\" (UniqueName: \"kubernetes.io/projected/19edfa3c-b549-4376-b53d-d0f8a448bdec-kube-api-access-dhb5n\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675302 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4967\" (UniqueName: \"kubernetes.io/projected/e090577d-dd68-4f18-b70a-836560c655ce-kube-api-access-n4967\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: 
\"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675322 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/881358ff-be80-45e8-878e-445fcd8f8bda-config\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675342 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-etcd-client\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675359 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/881358ff-be80-45e8-878e-445fcd8f8bda-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675374 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-image-import-ca\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675388 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-encryption-config\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675403 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-oauth-serving-cert\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675419 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wvn\" (UniqueName: \"kubernetes.io/projected/145932f5-ae28-4331-8bb2-a7fa535d7f96-kube-api-access-44wvn\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675440 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qk2r\" (UniqueName: \"kubernetes.io/projected/01831a9e-e080-4ede-905a-34277de02b46-kube-api-access-4qk2r\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: 
I0313 11:50:34.675479 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e3d7e088-128a-4842-832e-c78fdfe99913-images\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8412032b-b4df-4687-9631-f8bb7da83696-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675515 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d94210f8-5f1d-4fa5-8954-14d18f8fa0e4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5h7bj\" (UID: \"d94210f8-5f1d-4fa5-8954-14d18f8fa0e4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675533 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e090577d-dd68-4f18-b70a-836560c655ce-audit-dir\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675552 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/01831a9e-e080-4ede-905a-34277de02b46-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675568 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675697 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6wx2\" (UniqueName: \"kubernetes.io/projected/1c3df92f-12d8-45f6-9369-c8b12f933e3c-kube-api-access-f6wx2\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675722 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxdlx\" (UniqueName: \"kubernetes.io/projected/e3d7e088-128a-4842-832e-c78fdfe99913-kube-api-access-wxdlx\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675741 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-audit\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675778 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675795 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77ch4\" (UniqueName: \"kubernetes.io/projected/5cc708e6-8514-40a4-965a-991f84a8f0d4-kube-api-access-77ch4\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675811 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjxm\" (UniqueName: \"kubernetes.io/projected/b96faa7c-3975-4af4-8456-42ed7bbf9897-kube-api-access-rsjxm\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675828 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c3df92f-12d8-45f6-9369-c8b12f933e3c-serving-cert\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675845 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-etcd-client\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675850 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e090577d-dd68-4f18-b70a-836560c655ce-audit-dir\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675863 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01831a9e-e080-4ede-905a-34277de02b46-config\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.675895 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676085 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-serving-cert\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676123 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676141 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrlb\" (UniqueName: \"kubernetes.io/projected/21153583-f155-40cd-b0f0-1841cdb9c20d-kube-api-access-jsrlb\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676160 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8412032b-b4df-4687-9631-f8bb7da83696-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676176 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-config\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676193 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8412032b-b4df-4687-9631-f8bb7da83696-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676211 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-service-ca-bundle\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676228 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c989z\" (UniqueName: \"kubernetes.io/projected/8412032b-b4df-4687-9631-f8bb7da83696-kube-api-access-c989z\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676246 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-serving-cert\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676263 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46p2f\" (UniqueName: \"kubernetes.io/projected/ed0ec184-b55e-474a-9e11-72957a85689d-kube-api-access-46p2f\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676284 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676308 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/01831a9e-e080-4ede-905a-34277de02b46-images\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676332 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-audit-policies\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676350 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19edfa3c-b549-4376-b53d-d0f8a448bdec-config\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.676415 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.677123 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-audit-policies\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.677591 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21153583-f155-40cd-b0f0-1841cdb9c20d-node-pullsecrets\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.677634 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.677658 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.677678 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnd82\" (UniqueName: \"kubernetes.io/projected/b7a97bba-7000-4634-9cfe-efcc38685708-kube-api-access-vnd82\") 
pod \"downloads-7954f5f757-6rd6k\" (UID: \"b7a97bba-7000-4634-9cfe-efcc38685708\") " pod="openshift-console/downloads-7954f5f757-6rd6k" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.678249 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/01831a9e-e080-4ede-905a-34277de02b46-images\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.678402 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.678467 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b96faa7c-3975-4af4-8456-42ed7bbf9897-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.678484 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-etcd-serving-ca\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.678518 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-service-ca-bundle\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.678209 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.678663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679100 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-audit-policies\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679460 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679462 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/145932f5-ae28-4331-8bb2-a7fa535d7f96-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679528 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-config\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679595 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679565 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc708e6-8514-40a4-965a-991f84a8f0d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-trusted-ca-bundle\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679762 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21153583-f155-40cd-b0f0-1841cdb9c20d-audit-dir\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679817 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cc708e6-8514-40a4-965a-991f84a8f0d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679840 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-console-config\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.679973 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc 
kubenswrapper[4786]: I0313 11:50:34.680012 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-config\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.687672 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.688157 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c3df92f-12d8-45f6-9369-c8b12f933e3c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.688361 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.688457 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.688816 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/01831a9e-e080-4ede-905a-34277de02b46-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.688835 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c3df92f-12d8-45f6-9369-c8b12f933e3c-serving-cert\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.689163 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.689245 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc 
kubenswrapper[4786]: I0313 11:50:34.689509 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.689717 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.690485 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1f2ac51c-103a-4e6c-806b-498c330fa36a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z5l8m\" (UID: \"1f2ac51c-103a-4e6c-806b-498c330fa36a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.696111 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.696873 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.697295 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.697797 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.698099 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556710-bmgbc"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.698417 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.698913 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.699095 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.699330 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-encryption-config\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.699924 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.700024 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.700154 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-serving-cert\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.700315 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.702341 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7tbp9"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.702421 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.705063 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.706258 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7slx"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.707407 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/145932f5-ae28-4331-8bb2-a7fa535d7f96-etcd-client\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.707755 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.708303 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.709206 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.713404 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vcflh"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.716184 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.717149 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d5hqz"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.718268 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.719158 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.720300 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.720336 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.721295 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-q4rkw"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.722595 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.723723 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pckgr"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.724680 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8g6q"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.725673 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.726684 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.727690 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qbr8s"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.728649 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2gm6m"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.728697 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.729302 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2gm6m" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.729629 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cgkm9"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.730614 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6rd6k"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.731599 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-62rhz"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.732576 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.733706 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.735909 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j6thk"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.736861 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.737847 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.738863 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nzss5"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.739130 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 
11:50:34.739841 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.740812 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.742035 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.743001 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.744149 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2gm6m"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.744954 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qbr8s"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.745964 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vw8v9"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.746984 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-bmgbc"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.748006 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.748945 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fgsg7"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.750237 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-server-hmxw2"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.750664 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.750668 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.751004 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.752242 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fgsg7"] Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.759689 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.779784 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792428 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21153583-f155-40cd-b0f0-1841cdb9c20d-audit-dir\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792466 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8mb8\" (UniqueName: \"kubernetes.io/projected/f4eaf640-4e83-4528-ac9b-52a663fd5f05-kube-api-access-t8mb8\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" 
Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792488 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlcdc\" (UniqueName: \"kubernetes.io/projected/d7b8ca98-334c-438d-91ba-88b66fa36789-kube-api-access-zlcdc\") pod \"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792508 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p4sb\" (UniqueName: \"kubernetes.io/projected/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-kube-api-access-5p4sb\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792526 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19edfa3c-b549-4376-b53d-d0f8a448bdec-serving-cert\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792530 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21153583-f155-40cd-b0f0-1841cdb9c20d-audit-dir\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792557 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gmqh\" (UniqueName: 
\"kubernetes.io/projected/881358ff-be80-45e8-878e-445fcd8f8bda-kube-api-access-4gmqh\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792575 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d7e088-128a-4842-832e-c78fdfe99913-proxy-tls\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792591 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-signing-cabundle\") pod \"service-ca-9c57cc56f-vcflh\" (UID: \"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792607 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-service-ca\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792637 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49hbx\" (UniqueName: \"kubernetes.io/projected/d27bcc46-4fe6-47b9-8d99-9581308e512a-kube-api-access-49hbx\") pod \"machine-config-controller-84d6567774-5mjkr\" (UID: \"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:34 crc 
kubenswrapper[4786]: I0313 11:50:34.792662 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-etcd-client\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792679 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792696 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-image-import-ca\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792712 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-encryption-config\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m96jn\" (UniqueName: \"kubernetes.io/projected/b09d2cd7-622a-41cb-be28-c6bd24ae1267-kube-api-access-m96jn\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792748 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d94210f8-5f1d-4fa5-8954-14d18f8fa0e4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5h7bj\" (UID: \"d94210f8-5f1d-4fa5-8954-14d18f8fa0e4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792765 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-metrics-tls\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792783 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cb2b03d4-6c20-45cf-97a9-26ba24eff125-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792808 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxdlx\" (UniqueName: \"kubernetes.io/projected/e3d7e088-128a-4842-832e-c78fdfe99913-kube-api-access-wxdlx\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792824 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b8ca98-334c-438d-91ba-88b66fa36789-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792840 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rhm\" (UniqueName: \"kubernetes.io/projected/096982f0-2695-4021-9db9-a705d08903b0-kube-api-access-m6rhm\") pod \"dns-operator-744455d44c-cgkm9\" (UID: \"096982f0-2695-4021-9db9-a705d08903b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792858 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zhtj\" (UniqueName: \"kubernetes.io/projected/db9f1421-b65b-4929-a789-41038fa70ea8-kube-api-access-2zhtj\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792893 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db9f1421-b65b-4929-a789-41038fa70ea8-machine-approver-tls\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.792954 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d27bcc46-4fe6-47b9-8d99-9581308e512a-proxy-tls\") 
pod \"machine-config-controller-84d6567774-5mjkr\" (UID: \"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793045 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-client-ca\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793125 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-config\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793179 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c989z\" (UniqueName: \"kubernetes.io/projected/8412032b-b4df-4687-9631-f8bb7da83696-kube-api-access-c989z\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793203 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-serving-cert\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793235 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eaf640-4e83-4528-ac9b-52a663fd5f05-service-ca-bundle\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793300 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19edfa3c-b549-4376-b53d-d0f8a448bdec-config\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793492 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-service-ca\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793724 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-config\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793787 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793809 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vnd82\" (UniqueName: \"kubernetes.io/projected/b7a97bba-7000-4634-9cfe-efcc38685708-kube-api-access-vnd82\") pod \"downloads-7954f5f757-6rd6k\" (UID: \"b7a97bba-7000-4634-9cfe-efcc38685708\") " pod="openshift-console/downloads-7954f5f757-6rd6k" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793924 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b96faa7c-3975-4af4-8456-42ed7bbf9897-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.793942 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-etcd-serving-ca\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794264 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc708e6-8514-40a4-965a-991f84a8f0d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794348 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794496 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db9f1421-b65b-4929-a789-41038fa70ea8-config\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794549 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-ca\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794585 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb2b03d4-6c20-45cf-97a9-26ba24eff125-serving-cert\") pod \"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794636 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cc708e6-8514-40a4-965a-991f84a8f0d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794669 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-console-config\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794689 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794702 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-stats-auth\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794745 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-serving-cert\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794768 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b96faa7c-3975-4af4-8456-42ed7bbf9897-srv-cert\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794805 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-metrics-certs\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794820 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-config\") pod \"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98qvd\" (UniqueName: \"kubernetes.io/projected/cb2b03d4-6c20-45cf-97a9-26ba24eff125-kube-api-access-98qvd\") pod \"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794853 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-service-ca\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.794981 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-oauth-config\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 
11:50:34.795006 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdtjw\" (UniqueName: \"kubernetes.io/projected/d94210f8-5f1d-4fa5-8954-14d18f8fa0e4-kube-api-access-hdtjw\") pod \"control-plane-machine-set-operator-78cbb6b69f-5h7bj\" (UID: \"d94210f8-5f1d-4fa5-8954-14d18f8fa0e4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795044 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3d7e088-128a-4842-832e-c78fdfe99913-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795062 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhb5n\" (UniqueName: \"kubernetes.io/projected/19edfa3c-b549-4376-b53d-d0f8a448bdec-kube-api-access-dhb5n\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795078 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/881358ff-be80-45e8-878e-445fcd8f8bda-config\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795095 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09d2cd7-622a-41cb-be28-c6bd24ae1267-serving-cert\") pod 
\"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795162 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096982f0-2695-4021-9db9-a705d08903b0-metrics-tls\") pod \"dns-operator-744455d44c-cgkm9\" (UID: \"096982f0-2695-4021-9db9-a705d08903b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795180 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/881358ff-be80-45e8-878e-445fcd8f8bda-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795195 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-oauth-serving-cert\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795192 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc708e6-8514-40a4-965a-991f84a8f0d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795242 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795261 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e3d7e088-128a-4842-832e-c78fdfe99913-images\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795277 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8412032b-b4df-4687-9631-f8bb7da83696-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795312 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zs2h\" (UniqueName: \"kubernetes.io/projected/6a94b2e7-89b4-43f9-b7f6-3a433803914e-kube-api-access-9zs2h\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795330 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-config\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-default-certificate\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77ch4\" (UniqueName: \"kubernetes.io/projected/5cc708e6-8514-40a4-965a-991f84a8f0d4-kube-api-access-77ch4\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795461 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjxm\" (UniqueName: \"kubernetes.io/projected/b96faa7c-3975-4af4-8456-42ed7bbf9897-kube-api-access-rsjxm\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795499 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-audit\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.795852 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-image-import-ca\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796031 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-console-config\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796432 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3d7e088-128a-4842-832e-c78fdfe99913-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796442 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-serving-cert\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796557 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-oauth-serving-cert\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796569 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-client\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796597 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsrlb\" (UniqueName: \"kubernetes.io/projected/21153583-f155-40cd-b0f0-1841cdb9c20d-kube-api-access-jsrlb\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796636 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-signing-key\") pod \"service-ca-9c57cc56f-vcflh\" (UID: \"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796657 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8412032b-b4df-4687-9631-f8bb7da83696-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796675 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b8ca98-334c-438d-91ba-88b66fa36789-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796768 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-etcd-serving-ca\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796780 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/881358ff-be80-45e8-878e-445fcd8f8bda-config\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796719 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8412032b-b4df-4687-9631-f8bb7da83696-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796920 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a94b2e7-89b4-43f9-b7f6-3a433803914e-serving-cert\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.796966 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46p2f\" (UniqueName: 
\"kubernetes.io/projected/ed0ec184-b55e-474a-9e11-72957a85689d-kube-api-access-46p2f\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.797068 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db9f1421-b65b-4929-a789-41038fa70ea8-auth-proxy-config\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.797164 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d27bcc46-4fe6-47b9-8d99-9581308e512a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5mjkr\" (UID: \"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.797219 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcpp2\" (UniqueName: \"kubernetes.io/projected/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-kube-api-access-dcpp2\") pod \"service-ca-9c57cc56f-vcflh\" (UID: \"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.797267 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/21153583-f155-40cd-b0f0-1841cdb9c20d-audit\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 
11:50:34.797279 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21153583-f155-40cd-b0f0-1841cdb9c20d-node-pullsecrets\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.797332 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21153583-f155-40cd-b0f0-1841cdb9c20d-node-pullsecrets\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.797419 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-trusted-ca\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.797452 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-config\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.797528 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-trusted-ca-bundle\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " 
pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.798623 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-trusted-ca-bundle\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.799475 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8412032b-b4df-4687-9631-f8bb7da83696-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.799895 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.799972 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/881358ff-be80-45e8-878e-445fcd8f8bda-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.800040 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-etcd-client\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.801307 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-oauth-config\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.801938 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-serving-cert\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.805373 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cc708e6-8514-40a4-965a-991f84a8f0d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.805482 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8412032b-b4df-4687-9631-f8bb7da83696-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.819531 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/21153583-f155-40cd-b0f0-1841cdb9c20d-encryption-config\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.820016 
4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.839313 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.879983 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.895816 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b96faa7c-3975-4af4-8456-42ed7bbf9897-srv-cert\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898376 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-signing-cabundle\") pod \"service-ca-9c57cc56f-vcflh\" (UID: \"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898492 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49hbx\" (UniqueName: \"kubernetes.io/projected/d27bcc46-4fe6-47b9-8d99-9581308e512a-kube-api-access-49hbx\") pod \"machine-config-controller-84d6567774-5mjkr\" (UID: \"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898542 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898736 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m96jn\" (UniqueName: \"kubernetes.io/projected/b09d2cd7-622a-41cb-be28-c6bd24ae1267-kube-api-access-m96jn\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898800 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-metrics-tls\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898838 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cb2b03d4-6c20-45cf-97a9-26ba24eff125-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898875 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b8ca98-334c-438d-91ba-88b66fa36789-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898957 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rhm\" (UniqueName: \"kubernetes.io/projected/096982f0-2695-4021-9db9-a705d08903b0-kube-api-access-m6rhm\") pod \"dns-operator-744455d44c-cgkm9\" (UID: \"096982f0-2695-4021-9db9-a705d08903b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.898999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zhtj\" (UniqueName: \"kubernetes.io/projected/db9f1421-b65b-4929-a789-41038fa70ea8-kube-api-access-2zhtj\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db9f1421-b65b-4929-a789-41038fa70ea8-machine-approver-tls\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899073 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d27bcc46-4fe6-47b9-8d99-9581308e512a-proxy-tls\") pod \"machine-config-controller-84d6567774-5mjkr\" (UID: \"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-client-ca\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899164 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eaf640-4e83-4528-ac9b-52a663fd5f05-service-ca-bundle\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899319 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899361 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db9f1421-b65b-4929-a789-41038fa70ea8-config\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899394 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-ca\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899426 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb2b03d4-6c20-45cf-97a9-26ba24eff125-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-stats-auth\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899493 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-metrics-certs\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899724 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-config\") pod \"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899775 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98qvd\" (UniqueName: \"kubernetes.io/projected/cb2b03d4-6c20-45cf-97a9-26ba24eff125-kube-api-access-98qvd\") pod \"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899819 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-service-ca\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.899778 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cb2b03d4-6c20-45cf-97a9-26ba24eff125-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900147 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09d2cd7-622a-41cb-be28-c6bd24ae1267-serving-cert\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900191 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096982f0-2695-4021-9db9-a705d08903b0-metrics-tls\") pod \"dns-operator-744455d44c-cgkm9\" (UID: \"096982f0-2695-4021-9db9-a705d08903b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900257 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900303 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zs2h\" (UniqueName: \"kubernetes.io/projected/6a94b2e7-89b4-43f9-b7f6-3a433803914e-kube-api-access-9zs2h\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900336 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-config\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900368 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-default-certificate\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900423 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-client\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-signing-key\") pod \"service-ca-9c57cc56f-vcflh\" (UID: \"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:34 crc 
kubenswrapper[4786]: I0313 11:50:34.900512 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b8ca98-334c-438d-91ba-88b66fa36789-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900543 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a94b2e7-89b4-43f9-b7f6-3a433803914e-serving-cert\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900553 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-client-ca\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900584 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db9f1421-b65b-4929-a789-41038fa70ea8-auth-proxy-config\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d27bcc46-4fe6-47b9-8d99-9581308e512a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5mjkr\" (UID: \"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900655 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcpp2\" (UniqueName: \"kubernetes.io/projected/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-kube-api-access-dcpp2\") pod \"service-ca-9c57cc56f-vcflh\" (UID: \"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900689 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-trusted-ca\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900726 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-config\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900747 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900777 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8mb8\" (UniqueName: \"kubernetes.io/projected/f4eaf640-4e83-4528-ac9b-52a663fd5f05-kube-api-access-t8mb8\") 
pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900812 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlcdc\" (UniqueName: \"kubernetes.io/projected/d7b8ca98-334c-438d-91ba-88b66fa36789-kube-api-access-zlcdc\") pod \"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.900853 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p4sb\" (UniqueName: \"kubernetes.io/projected/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-kube-api-access-5p4sb\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.901070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eaf640-4e83-4528-ac9b-52a663fd5f05-service-ca-bundle\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.901455 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-ca\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.901964 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-config\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.902073 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-metrics-tls\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.902138 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-service-ca\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.902923 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-trusted-ca\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.903270 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d27bcc46-4fe6-47b9-8d99-9581308e512a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5mjkr\" (UID: \"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.903960 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096982f0-2695-4021-9db9-a705d08903b0-metrics-tls\") pod \"dns-operator-744455d44c-cgkm9\" (UID: \"096982f0-2695-4021-9db9-a705d08903b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.903976 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-config\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.904467 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-metrics-certs\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.905356 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09d2cd7-622a-41cb-be28-c6bd24ae1267-serving-cert\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.906522 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a94b2e7-89b4-43f9-b7f6-3a433803914e-serving-cert\") pod \"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.906624 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-default-certificate\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.907592 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b09d2cd7-622a-41cb-be28-c6bd24ae1267-etcd-client\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.908097 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4eaf640-4e83-4528-ac9b-52a663fd5f05-stats-auth\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.921134 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.940256 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.949342 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b96faa7c-3975-4af4-8456-42ed7bbf9897-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.960191 4786 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 11:50:34 crc kubenswrapper[4786]: I0313 11:50:34.980032 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.000927 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.019350 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.025779 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.040447 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.051119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-config\") pod \"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.060707 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 
11:50:35.075166 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8q2gx"] Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.080939 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.084038 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db9f1421-b65b-4929-a789-41038fa70ea8-auth-proxy-config\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:35 crc kubenswrapper[4786]: W0313 11:50:35.092033 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a3e1982_e5ea_40c7_b606_7b4464d32971.slice/crio-c986b9fffad44768268dde0d0c00c679d96aca70290e7a8cd74be887713de92a WatchSource:0}: Error finding container c986b9fffad44768268dde0d0c00c679d96aca70290e7a8cd74be887713de92a: Status 404 returned error can't find the container with id c986b9fffad44768268dde0d0c00c679d96aca70290e7a8cd74be887713de92a Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.099456 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.100322 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db9f1421-b65b-4929-a789-41038fa70ea8-config\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.119968 4786 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.140368 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.152708 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db9f1421-b65b-4929-a789-41038fa70ea8-machine-approver-tls\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.159578 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.166111 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b8ca98-334c-438d-91ba-88b66fa36789-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.180037 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.199854 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.209919 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b8ca98-334c-438d-91ba-88b66fa36789-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.219730 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.239867 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.260175 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.280811 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.299799 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.320820 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.328179 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19edfa3c-b549-4376-b53d-d0f8a448bdec-serving-cert\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.340040 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.345413 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19edfa3c-b549-4376-b53d-d0f8a448bdec-config\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.360102 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.379675 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.401052 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.420566 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.440817 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.472763 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.480832 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.500640 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 
11:50:35.520330 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.541187 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.553704 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb2b03d4-6c20-45cf-97a9-26ba24eff125-serving-cert\") pod \"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.560001 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.570587 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" event={"ID":"8a3e1982-e5ea-40c7-b606-7b4464d32971","Type":"ContainerStarted","Data":"6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e"} Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.570632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" event={"ID":"8a3e1982-e5ea-40c7-b606-7b4464d32971","Type":"ContainerStarted","Data":"c986b9fffad44768268dde0d0c00c679d96aca70290e7a8cd74be887713de92a"} Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.571195 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.579302 4786 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8q2gx container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.579373 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" podUID="8a3e1982-e5ea-40c7-b606-7b4464d32971" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.581234 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.599916 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.608323 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-signing-key\") pod \"service-ca-9c57cc56f-vcflh\" (UID: \"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.621113 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.640370 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.650193 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-signing-cabundle\") pod \"service-ca-9c57cc56f-vcflh\" (UID: 
\"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.659871 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.677900 4786 request.go:700] Waited for 1.018492522s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.680567 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.687550 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d94210f8-5f1d-4fa5-8954-14d18f8fa0e4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5h7bj\" (UID: \"d94210f8-5f1d-4fa5-8954-14d18f8fa0e4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.700418 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.720237 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.733468 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d27bcc46-4fe6-47b9-8d99-9581308e512a-proxy-tls\") pod \"machine-config-controller-84d6567774-5mjkr\" (UID: 
\"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.740510 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.760753 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.767634 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e3d7e088-128a-4842-832e-c78fdfe99913-images\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.780492 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 11:50:35 crc kubenswrapper[4786]: E0313 11:50:35.793226 4786 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 11:50:35 crc kubenswrapper[4786]: E0313 11:50:35.793348 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3d7e088-128a-4842-832e-c78fdfe99913-proxy-tls podName:e3d7e088-128a-4842-832e-c78fdfe99913 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:36.293320531 +0000 UTC m=+223.572973988 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e3d7e088-128a-4842-832e-c78fdfe99913-proxy-tls") pod "machine-config-operator-74547568cd-d8crf" (UID: "e3d7e088-128a-4842-832e-c78fdfe99913") : failed to sync secret cache: timed out waiting for the condition Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.800126 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.820494 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.840589 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.871535 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.881510 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.901747 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.920123 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.940032 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.960410 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.985164 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 11:50:35 crc kubenswrapper[4786]: I0313 11:50:35.999867 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.050910 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpkz4\" (UniqueName: \"kubernetes.io/projected/1f2ac51c-103a-4e6c-806b-498c330fa36a-kube-api-access-lpkz4\") pod \"cluster-samples-operator-665b6dd947-z5l8m\" (UID: \"1f2ac51c-103a-4e6c-806b-498c330fa36a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.067132 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.067959 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wvn\" (UniqueName: \"kubernetes.io/projected/145932f5-ae28-4331-8bb2-a7fa535d7f96-kube-api-access-44wvn\") pod \"apiserver-7bbb656c7d-ftnwb\" (UID: \"145932f5-ae28-4331-8bb2-a7fa535d7f96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.088963 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4967\" (UniqueName: \"kubernetes.io/projected/e090577d-dd68-4f18-b70a-836560c655ce-kube-api-access-n4967\") pod \"oauth-openshift-558db77b4-p7slx\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.148581 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.151548 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qk2r\" (UniqueName: \"kubernetes.io/projected/01831a9e-e080-4ede-905a-34277de02b46-kube-api-access-4qk2r\") pod \"machine-api-operator-5694c8668f-7tbp9\" (UID: \"01831a9e-e080-4ede-905a-34277de02b46\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.159521 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.180623 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.201605 4786 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.214344 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.220690 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.228721 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.236185 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.240102 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.261228 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.280972 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.300245 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.322275 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.332153 4786 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m"] Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.341582 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.350763 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d7e088-128a-4842-832e-c78fdfe99913-proxy-tls\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.360968 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d7e088-128a-4842-832e-c78fdfe99913-proxy-tls\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.361616 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.381671 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.400063 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.420204 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.440861 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.459750 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.468779 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7tbp9"] Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.480707 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.500655 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.508219 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7slx"] Mar 13 11:50:36 crc kubenswrapper[4786]: W0313 11:50:36.516002 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode090577d_dd68_4f18_b70a_836560c655ce.slice/crio-19f45b55749131f484204d730da515a461ef11f9652dc68d43ba30df2e0d7a11 WatchSource:0}: Error finding container 19f45b55749131f484204d730da515a461ef11f9652dc68d43ba30df2e0d7a11: Status 404 returned error can't find the container with id 19f45b55749131f484204d730da515a461ef11f9652dc68d43ba30df2e0d7a11 Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.519371 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.539676 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.559703 4786 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.580004 4786 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.594065 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" event={"ID":"1f2ac51c-103a-4e6c-806b-498c330fa36a","Type":"ContainerStarted","Data":"e07033ea74eb9efa9def50c3feab09bb3c1db8d4f1c795a097541e96c03d126e"} Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.594116 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" event={"ID":"1f2ac51c-103a-4e6c-806b-498c330fa36a","Type":"ContainerStarted","Data":"fd2a053e549963fca91822cc1c26dc57aa5982d714d85db170a92b83ca902900"} Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.595607 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" event={"ID":"e090577d-dd68-4f18-b70a-836560c655ce","Type":"ContainerStarted","Data":"19f45b55749131f484204d730da515a461ef11f9652dc68d43ba30df2e0d7a11"} Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.598105 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" event={"ID":"01831a9e-e080-4ede-905a-34277de02b46","Type":"ContainerStarted","Data":"306b30653c17e0da9cec15c815a1b070cc2d7219602015547517fe6c2958dd44"} Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.598162 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" 
event={"ID":"01831a9e-e080-4ede-905a-34277de02b46","Type":"ContainerStarted","Data":"559a18c5d599c925676ec4905a92510782861fe45d82a85b56798c3e0b84e032"} Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.601799 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.603747 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.637027 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gmqh\" (UniqueName: \"kubernetes.io/projected/881358ff-be80-45e8-878e-445fcd8f8bda-kube-api-access-4gmqh\") pod \"openshift-apiserver-operator-796bbdcf4f-f66sl\" (UID: \"881358ff-be80-45e8-878e-445fcd8f8bda\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.654278 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxdlx\" (UniqueName: \"kubernetes.io/projected/e3d7e088-128a-4842-832e-c78fdfe99913-kube-api-access-wxdlx\") pod \"machine-config-operator-74547568cd-d8crf\" (UID: \"e3d7e088-128a-4842-832e-c78fdfe99913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.661739 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.664127 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.675372 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c989z\" (UniqueName: \"kubernetes.io/projected/8412032b-b4df-4687-9631-f8bb7da83696-kube-api-access-c989z\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.677933 4786 request.go:700] Waited for 1.883763022s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/default/token Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.694597 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnd82\" (UniqueName: \"kubernetes.io/projected/b7a97bba-7000-4634-9cfe-efcc38685708-kube-api-access-vnd82\") pod \"downloads-7954f5f757-6rd6k\" (UID: \"b7a97bba-7000-4634-9cfe-efcc38685708\") " pod="openshift-console/downloads-7954f5f757-6rd6k" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.719216 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdtjw\" (UniqueName: \"kubernetes.io/projected/d94210f8-5f1d-4fa5-8954-14d18f8fa0e4-kube-api-access-hdtjw\") pod \"control-plane-machine-set-operator-78cbb6b69f-5h7bj\" (UID: \"d94210f8-5f1d-4fa5-8954-14d18f8fa0e4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.738200 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb"] Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.742844 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dhb5n\" (UniqueName: \"kubernetes.io/projected/19edfa3c-b549-4376-b53d-d0f8a448bdec-kube-api-access-dhb5n\") pod \"service-ca-operator-777779d784-wwmt6\" (UID: \"19edfa3c-b549-4376-b53d-d0f8a448bdec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.755271 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6rd6k" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.757572 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjxm\" (UniqueName: \"kubernetes.io/projected/b96faa7c-3975-4af4-8456-42ed7bbf9897-kube-api-access-rsjxm\") pod \"olm-operator-6b444d44fb-m72sv\" (UID: \"b96faa7c-3975-4af4-8456-42ed7bbf9897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.775422 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrlb\" (UniqueName: \"kubernetes.io/projected/21153583-f155-40cd-b0f0-1841cdb9c20d-kube-api-access-jsrlb\") pod \"apiserver-76f77b778f-pckgr\" (UID: \"21153583-f155-40cd-b0f0-1841cdb9c20d\") " pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.804491 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77ch4\" (UniqueName: \"kubernetes.io/projected/5cc708e6-8514-40a4-965a-991f84a8f0d4-kube-api-access-77ch4\") pod \"openshift-controller-manager-operator-756b6f6bc6-vjl6f\" (UID: \"5cc708e6-8514-40a4-965a-991f84a8f0d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.808303 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.814961 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8412032b-b4df-4687-9631-f8bb7da83696-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ft9lw\" (UID: \"8412032b-b4df-4687-9631-f8bb7da83696\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.828670 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.843456 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46p2f\" (UniqueName: \"kubernetes.io/projected/ed0ec184-b55e-474a-9e11-72957a85689d-kube-api-access-46p2f\") pod \"console-f9d7485db-nzss5\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.855912 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.880337 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.882287 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49hbx\" (UniqueName: \"kubernetes.io/projected/d27bcc46-4fe6-47b9-8d99-9581308e512a-kube-api-access-49hbx\") pod \"machine-config-controller-84d6567774-5mjkr\" (UID: \"d27bcc46-4fe6-47b9-8d99-9581308e512a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.897205 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m96jn\" (UniqueName: \"kubernetes.io/projected/b09d2cd7-622a-41cb-be28-c6bd24ae1267-kube-api-access-m96jn\") pod \"etcd-operator-b45778765-c8g6q\" (UID: \"b09d2cd7-622a-41cb-be28-c6bd24ae1267\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.914702 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4ad78e0-02e4-4013-9f61-9f344b5e3f15-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hkcqf\" (UID: \"d4ad78e0-02e4-4013-9f61-9f344b5e3f15\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.922257 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.945936 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.949161 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zhtj\" (UniqueName: \"kubernetes.io/projected/db9f1421-b65b-4929-a789-41038fa70ea8-kube-api-access-2zhtj\") pod \"machine-approver-56656f9798-j6xnk\" (UID: \"db9f1421-b65b-4929-a789-41038fa70ea8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.952447 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.954453 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rhm\" (UniqueName: \"kubernetes.io/projected/096982f0-2695-4021-9db9-a705d08903b0-kube-api-access-m6rhm\") pod \"dns-operator-744455d44c-cgkm9\" (UID: \"096982f0-2695-4021-9db9-a705d08903b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" Mar 13 11:50:36 crc kubenswrapper[4786]: I0313 11:50:36.974137 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.001896 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.012927 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98qvd\" (UniqueName: \"kubernetes.io/projected/cb2b03d4-6c20-45cf-97a9-26ba24eff125-kube-api-access-98qvd\") pod \"openshift-config-operator-7777fb866f-j6thk\" (UID: \"cb2b03d4-6c20-45cf-97a9-26ba24eff125\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.017794 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf"] Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.023579 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p4sb\" (UniqueName: \"kubernetes.io/projected/4fbcc49c-59c2-4eb3-9fa3-60c57280c8af-kube-api-access-5p4sb\") pod \"ingress-operator-5b745b69d9-6bvx4\" (UID: \"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:37 crc kubenswrapper[4786]: W0313 11:50:37.034825 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3d7e088_128a_4842_832e_c78fdfe99913.slice/crio-43617bae491d8abe1466453a850db3914f862efe7f81a44233b46cfba34d4fa1 WatchSource:0}: Error finding container 43617bae491d8abe1466453a850db3914f862efe7f81a44233b46cfba34d4fa1: Status 404 returned error can't find the container with id 43617bae491d8abe1466453a850db3914f862efe7f81a44233b46cfba34d4fa1 Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.066868 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zs2h\" (UniqueName: \"kubernetes.io/projected/6a94b2e7-89b4-43f9-b7f6-3a433803914e-kube-api-access-9zs2h\") pod 
\"route-controller-manager-6576b87f9c-m9k26\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.068803 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.071604 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.071756 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcpp2\" (UniqueName: \"kubernetes.io/projected/7aee0f31-69c3-4d4c-bc7b-c9a3123898f5-kube-api-access-dcpp2\") pod \"service-ca-9c57cc56f-vcflh\" (UID: \"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.101649 4786 projected.go:288] Couldn't get configMap openshift-authentication-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.101867 4786 projected.go:194] Error preparing data for projected volume kube-api-access-f6wx2 for pod openshift-authentication-operator/authentication-operator-69f744f599-76tgr: failed to sync configmap cache: timed out waiting for the condition Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.101931 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c3df92f-12d8-45f6-9369-c8b12f933e3c-kube-api-access-f6wx2 podName:1c3df92f-12d8-45f6-9369-c8b12f933e3c nodeName:}" failed. No retries permitted until 2026-03-13 11:50:37.601915606 +0000 UTC m=+224.881569043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f6wx2" (UniqueName: "kubernetes.io/projected/1c3df92f-12d8-45f6-9369-c8b12f933e3c-kube-api-access-f6wx2") pod "authentication-operator-69f744f599-76tgr" (UID: "1c3df92f-12d8-45f6-9369-c8b12f933e3c") : failed to sync configmap cache: timed out waiting for the condition Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.107979 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.108596 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlcdc\" (UniqueName: \"kubernetes.io/projected/d7b8ca98-334c-438d-91ba-88b66fa36789-kube-api-access-zlcdc\") pod \"kube-storage-version-migrator-operator-b67b599dd-8sn6r\" (UID: \"d7b8ca98-334c-438d-91ba-88b66fa36789\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.110795 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8mb8\" (UniqueName: \"kubernetes.io/projected/f4eaf640-4e83-4528-ac9b-52a663fd5f05-kube-api-access-t8mb8\") pod \"router-default-5444994796-hk9g4\" (UID: \"f4eaf640-4e83-4528-ac9b-52a663fd5f05\") " pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.119522 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.131667 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.152646 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl"] Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161164 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/525e850e-04a9-4dc1-91ab-a508136a5e60-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161244 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bf3c97d-e226-404e-814c-1d9a9c525ab2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161273 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-trusted-ca\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161313 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rth2j\" (UniqueName: \"kubernetes.io/projected/344058e9-4126-475b-9108-e877dbb8201e-kube-api-access-rth2j\") pod \"console-operator-58897d9998-d5hqz\" (UID: 
\"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161350 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/344058e9-4126-475b-9108-e877dbb8201e-serving-cert\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161380 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-certificates\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161421 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bf3c97d-e226-404e-814c-1d9a9c525ab2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161470 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxd2z\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-kube-api-access-xxd2z\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161500 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/344058e9-4126-475b-9108-e877dbb8201e-config\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/525e850e-04a9-4dc1-91ab-a508136a5e60-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-bound-sa-token\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161579 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-tls\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161603 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/344058e9-4126-475b-9108-e877dbb8201e-trusted-ca\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.161627 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bf3c97d-e226-404e-814c-1d9a9c525ab2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.162074 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:37.662061479 +0000 UTC m=+224.941714926 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.168358 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.189306 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.201572 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.231256 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.232851 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.239619 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.264852 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.267093 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:37.767062294 +0000 UTC m=+225.046715741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267192 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/525e850e-04a9-4dc1-91ab-a508136a5e60-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267233 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-bound-sa-token\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267257 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52da5870-8c41-475f-b1e8-3689fc3d43a6-config\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267299 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-tls\") pod 
\"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267318 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jtnp\" (UniqueName: \"kubernetes.io/projected/7eb9105b-0760-456d-a4ed-7ef4543a7967-kube-api-access-6jtnp\") pod \"migrator-59844c95c7-8dklp\" (UID: \"7eb9105b-0760-456d-a4ed-7ef4543a7967\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267343 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4675c625-0d2c-4358-9241-627d96dcb2f0-secret-volume\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267374 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkw7\" (UniqueName: \"kubernetes.io/projected/8f345a34-32f4-4c84-beb8-079212add522-kube-api-access-vmkw7\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267390 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9p8l\" (UniqueName: \"kubernetes.io/projected/4601292a-f734-479a-b592-178c0cefbea0-kube-api-access-b9p8l\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267423 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bf3c97d-e226-404e-814c-1d9a9c525ab2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267438 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/344058e9-4126-475b-9108-e877dbb8201e-trusted-ca\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267461 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/957d3fc1-041b-405a-a88a-22b3ad8ad95a-profile-collector-cert\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267514 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f345a34-32f4-4c84-beb8-079212add522-config-volume\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267538 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4601292a-f734-479a-b592-178c0cefbea0-certs\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " 
pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267585 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4675c625-0d2c-4358-9241-627d96dcb2f0-config-volume\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-mountpoint-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267623 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/525e850e-04a9-4dc1-91ab-a508136a5e60-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267640 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-apiservice-cert\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267704 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/52da5870-8c41-475f-b1e8-3689fc3d43a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267760 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn6c6\" (UniqueName: \"kubernetes.io/projected/2ac0a51e-aad8-4046-b8de-1e463db2b6b2-kube-api-access-tn6c6\") pod \"ingress-canary-2gm6m\" (UID: \"2ac0a51e-aad8-4046-b8de-1e463db2b6b2\") " pod="openshift-ingress-canary/ingress-canary-2gm6m" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267815 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bf3c97d-e226-404e-814c-1d9a9c525ab2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267845 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-trusted-ca\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267938 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f345a34-32f4-4c84-beb8-079212add522-metrics-tls\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267959 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.267993 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rth2j\" (UniqueName: \"kubernetes.io/projected/344058e9-4126-475b-9108-e877dbb8201e-kube-api-access-rth2j\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268015 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-registration-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268031 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a2c830be-ced3-4cf1-b515-de38f40e9418-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vw8v9\" (UID: \"a2c830be-ced3-4cf1-b515-de38f40e9418\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268046 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-socket-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: 
\"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268062 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/957d3fc1-041b-405a-a88a-22b3ad8ad95a-srv-cert\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268076 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4601292a-f734-479a-b592-178c0cefbea0-node-bootstrap-token\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268154 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5qp2\" (UniqueName: \"kubernetes.io/projected/01e610d5-a3b8-4fc8-a472-01ab5bb625d5-kube-api-access-q5qp2\") pod \"auto-csr-approver-29556710-bmgbc\" (UID: \"01e610d5-a3b8-4fc8-a472-01ab5bb625d5\") " pod="openshift-infra/auto-csr-approver-29556710-bmgbc" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268178 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rd49d\" (UniqueName: \"kubernetes.io/projected/02e6f07f-826a-4784-bd45-87ee7984ad04-kube-api-access-rd49d\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268222 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/344058e9-4126-475b-9108-e877dbb8201e-serving-cert\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268250 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k7js\" (UniqueName: \"kubernetes.io/projected/4675c625-0d2c-4358-9241-627d96dcb2f0-kube-api-access-4k7js\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268273 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268289 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghnh2\" (UniqueName: \"kubernetes.io/projected/a2c830be-ced3-4cf1-b515-de38f40e9418-kube-api-access-ghnh2\") pod \"multus-admission-controller-857f4d67dd-vw8v9\" (UID: \"a2c830be-ced3-4cf1-b515-de38f40e9418\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268331 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-certificates\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268351 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ac0a51e-aad8-4046-b8de-1e463db2b6b2-cert\") pod \"ingress-canary-2gm6m\" (UID: \"2ac0a51e-aad8-4046-b8de-1e463db2b6b2\") " pod="openshift-ingress-canary/ingress-canary-2gm6m" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268366 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-tmpfs\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268381 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-webhook-cert\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268413 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bf3c97d-e226-404e-814c-1d9a9c525ab2-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268430 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b90dd9c-139c-4527-ab2b-258022bca18a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8ssmw\" (UID: \"7b90dd9c-139c-4527-ab2b-258022bca18a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268492 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxd2z\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-kube-api-access-xxd2z\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268510 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkgc\" (UniqueName: \"kubernetes.io/projected/7b90dd9c-139c-4527-ab2b-258022bca18a-kube-api-access-nqkgc\") pod \"package-server-manager-789f6589d5-8ssmw\" (UID: \"7b90dd9c-139c-4527-ab2b-258022bca18a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268525 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52da5870-8c41-475f-b1e8-3689fc3d43a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268560 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/344058e9-4126-475b-9108-e877dbb8201e-config\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268587 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fkxr\" (UniqueName: \"kubernetes.io/projected/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-kube-api-access-9fkxr\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268624 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-plugins-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268640 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kw4\" (UniqueName: \"kubernetes.io/projected/957d3fc1-041b-405a-a88a-22b3ad8ad95a-kube-api-access-g5kw4\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268673 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dknd4\" (UniqueName: \"kubernetes.io/projected/5c09ab49-3d49-495b-af13-5fd937259b53-kube-api-access-dknd4\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.268708 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-csi-data-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.269321 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/525e850e-04a9-4dc1-91ab-a508136a5e60-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.272824 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/344058e9-4126-475b-9108-e877dbb8201e-trusted-ca\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.273612 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:37.773596794 +0000 UTC m=+225.053250241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.273871 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/344058e9-4126-475b-9108-e877dbb8201e-config\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.273997 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/344058e9-4126-475b-9108-e877dbb8201e-serving-cert\") pod \"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.282015 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-trusted-ca\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.283792 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-certificates\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.284564 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-tls\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.285928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/525e850e-04a9-4dc1-91ab-a508136a5e60-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.286652 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bf3c97d-e226-404e-814c-1d9a9c525ab2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.303641 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-bound-sa-token\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.303995 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rth2j\" (UniqueName: \"kubernetes.io/projected/344058e9-4126-475b-9108-e877dbb8201e-kube-api-access-rth2j\") pod 
\"console-operator-58897d9998-d5hqz\" (UID: \"344058e9-4126-475b-9108-e877dbb8201e\") " pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.313641 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bf3c97d-e226-404e-814c-1d9a9c525ab2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.330249 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxd2z\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-kube-api-access-xxd2z\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.347928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bf3c97d-e226-404e-814c-1d9a9c525ab2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k8t2r\" (UID: \"4bf3c97d-e226-404e-814c-1d9a9c525ab2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369322 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369494 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-nqkgc\" (UniqueName: \"kubernetes.io/projected/7b90dd9c-139c-4527-ab2b-258022bca18a-kube-api-access-nqkgc\") pod \"package-server-manager-789f6589d5-8ssmw\" (UID: \"7b90dd9c-139c-4527-ab2b-258022bca18a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369530 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52da5870-8c41-475f-b1e8-3689fc3d43a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369558 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fkxr\" (UniqueName: \"kubernetes.io/projected/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-kube-api-access-9fkxr\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369591 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-plugins-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369615 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kw4\" (UniqueName: \"kubernetes.io/projected/957d3fc1-041b-405a-a88a-22b3ad8ad95a-kube-api-access-g5kw4\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369638 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dknd4\" (UniqueName: \"kubernetes.io/projected/5c09ab49-3d49-495b-af13-5fd937259b53-kube-api-access-dknd4\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369662 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-csi-data-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369689 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52da5870-8c41-475f-b1e8-3689fc3d43a6-config\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369714 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jtnp\" (UniqueName: \"kubernetes.io/projected/7eb9105b-0760-456d-a4ed-7ef4543a7967-kube-api-access-6jtnp\") pod \"migrator-59844c95c7-8dklp\" (UID: \"7eb9105b-0760-456d-a4ed-7ef4543a7967\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369747 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4675c625-0d2c-4358-9241-627d96dcb2f0-secret-volume\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369773 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmkw7\" (UniqueName: \"kubernetes.io/projected/8f345a34-32f4-4c84-beb8-079212add522-kube-api-access-vmkw7\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9p8l\" (UniqueName: \"kubernetes.io/projected/4601292a-f734-479a-b592-178c0cefbea0-kube-api-access-b9p8l\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369838 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/957d3fc1-041b-405a-a88a-22b3ad8ad95a-profile-collector-cert\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369870 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f345a34-32f4-4c84-beb8-079212add522-config-volume\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369914 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/4601292a-f734-479a-b592-178c0cefbea0-certs\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369952 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4675c625-0d2c-4358-9241-627d96dcb2f0-config-volume\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369976 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-apiservice-cert\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.369999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-mountpoint-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52da5870-8c41-475f-b1e8-3689fc3d43a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 
11:50:37.370061 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn6c6\" (UniqueName: \"kubernetes.io/projected/2ac0a51e-aad8-4046-b8de-1e463db2b6b2-kube-api-access-tn6c6\") pod \"ingress-canary-2gm6m\" (UID: \"2ac0a51e-aad8-4046-b8de-1e463db2b6b2\") " pod="openshift-ingress-canary/ingress-canary-2gm6m" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370095 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f345a34-32f4-4c84-beb8-079212add522-metrics-tls\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370143 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-registration-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370164 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-socket-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 
11:50:37.370187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/957d3fc1-041b-405a-a88a-22b3ad8ad95a-srv-cert\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370232 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a2c830be-ced3-4cf1-b515-de38f40e9418-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vw8v9\" (UID: \"a2c830be-ced3-4cf1-b515-de38f40e9418\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370256 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4601292a-f734-479a-b592-178c0cefbea0-node-bootstrap-token\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370279 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5qp2\" (UniqueName: \"kubernetes.io/projected/01e610d5-a3b8-4fc8-a472-01ab5bb625d5-kube-api-access-q5qp2\") pod \"auto-csr-approver-29556710-bmgbc\" (UID: \"01e610d5-a3b8-4fc8-a472-01ab5bb625d5\") 
" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370302 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd49d\" (UniqueName: \"kubernetes.io/projected/02e6f07f-826a-4784-bd45-87ee7984ad04-kube-api-access-rd49d\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370330 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k7js\" (UniqueName: \"kubernetes.io/projected/4675c625-0d2c-4358-9241-627d96dcb2f0-kube-api-access-4k7js\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370364 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghnh2\" (UniqueName: \"kubernetes.io/projected/a2c830be-ced3-4cf1-b515-de38f40e9418-kube-api-access-ghnh2\") pod \"multus-admission-controller-857f4d67dd-vw8v9\" (UID: \"a2c830be-ced3-4cf1-b515-de38f40e9418\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370388 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-tmpfs\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370409 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-webhook-cert\") pod 
\"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370432 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ac0a51e-aad8-4046-b8de-1e463db2b6b2-cert\") pod \"ingress-canary-2gm6m\" (UID: \"2ac0a51e-aad8-4046-b8de-1e463db2b6b2\") " pod="openshift-ingress-canary/ingress-canary-2gm6m" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.370468 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b90dd9c-139c-4527-ab2b-258022bca18a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8ssmw\" (UID: \"7b90dd9c-139c-4527-ab2b-258022bca18a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.374524 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b90dd9c-139c-4527-ab2b-258022bca18a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8ssmw\" (UID: \"7b90dd9c-139c-4527-ab2b-258022bca18a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.374636 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:37.874618291 +0000 UTC m=+225.154271738 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.376590 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6rd6k"] Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.377055 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-plugins-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.377444 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-csi-data-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.379927 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f345a34-32f4-4c84-beb8-079212add522-config-volume\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.381958 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-registration-dir\") pod 
\"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.382651 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4675c625-0d2c-4358-9241-627d96dcb2f0-secret-volume\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.382819 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4675c625-0d2c-4358-9241-627d96dcb2f0-config-volume\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.383597 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-socket-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.384094 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52da5870-8c41-475f-b1e8-3689fc3d43a6-config\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.384155 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/957d3fc1-041b-405a-a88a-22b3ad8ad95a-profile-collector-cert\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.385929 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.386004 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/02e6f07f-826a-4784-bd45-87ee7984ad04-mountpoint-dir\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.386268 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-tmpfs\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.391823 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.394030 4786 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pckgr"] Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.394089 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4601292a-f734-479a-b592-178c0cefbea0-node-bootstrap-token\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.394766 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw"] Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.396542 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-apiservice-cert\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.396825 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4601292a-f734-479a-b592-178c0cefbea0-certs\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.397862 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a2c830be-ced3-4cf1-b515-de38f40e9418-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vw8v9\" (UID: \"a2c830be-ced3-4cf1-b515-de38f40e9418\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.398655 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52da5870-8c41-475f-b1e8-3689fc3d43a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.399193 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f345a34-32f4-4c84-beb8-079212add522-metrics-tls\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.400474 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/957d3fc1-041b-405a-a88a-22b3ad8ad95a-srv-cert\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.403538 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ac0a51e-aad8-4046-b8de-1e463db2b6b2-cert\") pod \"ingress-canary-2gm6m\" (UID: \"2ac0a51e-aad8-4046-b8de-1e463db2b6b2\") " pod="openshift-ingress-canary/ingress-canary-2gm6m" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.406483 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-webhook-cert\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.425690 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nqkgc\" (UniqueName: \"kubernetes.io/projected/7b90dd9c-139c-4527-ab2b-258022bca18a-kube-api-access-nqkgc\") pod \"package-server-manager-789f6589d5-8ssmw\" (UID: \"7b90dd9c-139c-4527-ab2b-258022bca18a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.433015 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52da5870-8c41-475f-b1e8-3689fc3d43a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-md56c\" (UID: \"52da5870-8c41-475f-b1e8-3689fc3d43a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.439203 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fkxr\" (UniqueName: \"kubernetes.io/projected/0ab2a788-5f46-45b0-a4a4-43dbfceeddc9-kube-api-access-9fkxr\") pod \"packageserver-d55dfcdfc-557zt\" (UID: \"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.448344 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.457841 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5kw4\" (UniqueName: \"kubernetes.io/projected/957d3fc1-041b-405a-a88a-22b3ad8ad95a-kube-api-access-g5kw4\") pod \"catalog-operator-68c6474976-kpxm6\" (UID: \"957d3fc1-041b-405a-a88a-22b3ad8ad95a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.472550 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.472933 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:37.972922022 +0000 UTC m=+225.252575469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.483067 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dknd4\" (UniqueName: \"kubernetes.io/projected/5c09ab49-3d49-495b-af13-5fd937259b53-kube-api-access-dknd4\") pod \"marketplace-operator-79b997595-62rhz\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.517439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jtnp\" (UniqueName: \"kubernetes.io/projected/7eb9105b-0760-456d-a4ed-7ef4543a7967-kube-api-access-6jtnp\") pod \"migrator-59844c95c7-8dklp\" (UID: \"7eb9105b-0760-456d-a4ed-7ef4543a7967\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.524592 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.529422 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkw7\" (UniqueName: \"kubernetes.io/projected/8f345a34-32f4-4c84-beb8-079212add522-kube-api-access-vmkw7\") pod \"dns-default-qbr8s\" (UID: \"8f345a34-32f4-4c84-beb8-079212add522\") " pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.551862 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9p8l\" (UniqueName: \"kubernetes.io/projected/4601292a-f734-479a-b592-178c0cefbea0-kube-api-access-b9p8l\") pod \"machine-config-server-hmxw2\" (UID: \"4601292a-f734-479a-b592-178c0cefbea0\") " pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.561607 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn6c6\" (UniqueName: \"kubernetes.io/projected/2ac0a51e-aad8-4046-b8de-1e463db2b6b2-kube-api-access-tn6c6\") pod \"ingress-canary-2gm6m\" (UID: \"2ac0a51e-aad8-4046-b8de-1e463db2b6b2\") " pod="openshift-ingress-canary/ingress-canary-2gm6m" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.576566 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.577456 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.577983 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.578286 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.078272607 +0000 UTC m=+225.357926054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.583514 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.594143 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k7js\" (UniqueName: \"kubernetes.io/projected/4675c625-0d2c-4358-9241-627d96dcb2f0-kube-api-access-4k7js\") pod \"collect-profiles-29556705-gkzpf\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.604151 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.609588 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghnh2\" (UniqueName: \"kubernetes.io/projected/a2c830be-ced3-4cf1-b515-de38f40e9418-kube-api-access-ghnh2\") pod \"multus-admission-controller-857f4d67dd-vw8v9\" (UID: \"a2c830be-ced3-4cf1-b515-de38f40e9418\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.610340 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.633562 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.636787 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" event={"ID":"1f2ac51c-103a-4e6c-806b-498c330fa36a","Type":"ContainerStarted","Data":"d9d1c06e5c89f7b63b29b0561374367ae845af7b4556896c017a7213e4932355"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.640965 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.643776 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6rd6k" event={"ID":"b7a97bba-7000-4634-9cfe-efcc38685708","Type":"ContainerStarted","Data":"7af324d1eac9e8af96d580944eab9fde5f0d1ec5550ed667c5a3b8e27163e09a"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.646814 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.653479 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2gm6m" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.654629 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" event={"ID":"e3d7e088-128a-4842-832e-c78fdfe99913","Type":"ContainerStarted","Data":"261c67947a1bdab43c4886454b8818585fbc6f697290a9ce7eb28cce2dff523f"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.654806 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" event={"ID":"e3d7e088-128a-4842-832e-c78fdfe99913","Type":"ContainerStarted","Data":"43617bae491d8abe1466453a850db3914f862efe7f81a44233b46cfba34d4fa1"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.657211 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" event={"ID":"881358ff-be80-45e8-878e-445fcd8f8bda","Type":"ContainerStarted","Data":"7e84304259dc9560217203221d9f8859863df14223acabd35f2592e11e548248"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.674689 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hmxw2" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.683803 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6wx2\" (UniqueName: \"kubernetes.io/projected/1c3df92f-12d8-45f6-9369-c8b12f933e3c-kube-api-access-f6wx2\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.683923 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.684198 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.184188319 +0000 UTC m=+225.463841766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.686952 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5qp2\" (UniqueName: \"kubernetes.io/projected/01e610d5-a3b8-4fc8-a472-01ab5bb625d5-kube-api-access-q5qp2\") pod \"auto-csr-approver-29556710-bmgbc\" (UID: \"01e610d5-a3b8-4fc8-a472-01ab5bb625d5\") " pod="openshift-infra/auto-csr-approver-29556710-bmgbc" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.692728 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd49d\" (UniqueName: \"kubernetes.io/projected/02e6f07f-826a-4784-bd45-87ee7984ad04-kube-api-access-rd49d\") pod \"csi-hostpathplugin-fgsg7\" (UID: \"02e6f07f-826a-4784-bd45-87ee7984ad04\") " pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.696383 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" event={"ID":"01831a9e-e080-4ede-905a-34277de02b46","Type":"ContainerStarted","Data":"bf3225a2428cdd0b2de7205564663391dda7029e3d098b27282a17561c9f50e1"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.725799 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hk9g4" event={"ID":"f4eaf640-4e83-4528-ac9b-52a663fd5f05","Type":"ContainerStarted","Data":"006729fa000e23c6a6d44bef55d09a29f6d303d2590d6041e70cff7b17b13737"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.733480 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6wx2\" (UniqueName: \"kubernetes.io/projected/1c3df92f-12d8-45f6-9369-c8b12f933e3c-kube-api-access-f6wx2\") pod \"authentication-operator-69f744f599-76tgr\" (UID: \"1c3df92f-12d8-45f6-9369-c8b12f933e3c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.736068 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv"] Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.756409 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nzss5"] Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.759340 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" event={"ID":"8412032b-b4df-4687-9631-f8bb7da83696","Type":"ContainerStarted","Data":"fe79f195100b5a3ee4605b4c756cab5e6945752fd340df50ba5e76a56125378d"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.760986 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" event={"ID":"db9f1421-b65b-4929-a789-41038fa70ea8","Type":"ContainerStarted","Data":"b4c25f88b934b0a0e5e59854edb7f3145761c67ea25776907f5d3022525d2678"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.765416 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" event={"ID":"e090577d-dd68-4f18-b70a-836560c655ce","Type":"ContainerStarted","Data":"ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.765815 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:37 crc kubenswrapper[4786]: 
I0313 11:50:37.766368 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" event={"ID":"21153583-f155-40cd-b0f0-1841cdb9c20d","Type":"ContainerStarted","Data":"0f0f545ac2032aef1669ace329310ec0023a9f1fa44773dbb64e4765e8ad879c"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.768861 4786 generic.go:334] "Generic (PLEG): container finished" podID="145932f5-ae28-4331-8bb2-a7fa535d7f96" containerID="3375fd2c350b11d236b7a80e7b2ac1ebbccbe3d2bcb970bd585cd7a62b908376" exitCode=0 Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.769212 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" event={"ID":"145932f5-ae28-4331-8bb2-a7fa535d7f96","Type":"ContainerDied","Data":"3375fd2c350b11d236b7a80e7b2ac1ebbccbe3d2bcb970bd585cd7a62b908376"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.769256 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" event={"ID":"145932f5-ae28-4331-8bb2-a7fa535d7f96","Type":"ContainerStarted","Data":"758b346d0b99e6b804f609c5ea848439d2a1fa376ce5e0abb3ac59e7e8a79c0e"} Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.785479 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.786319 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.286303305 +0000 UTC m=+225.565956752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.886722 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.887825 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.387807824 +0000 UTC m=+225.667461271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.897142 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.925598 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.967277 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.990372 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:37 crc kubenswrapper[4786]: E0313 11:50:37.990963 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.490948319 +0000 UTC m=+225.770601766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:37 crc kubenswrapper[4786]: I0313 11:50:37.998134 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.092743 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.093116 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.593105887 +0000 UTC m=+225.872759334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.171108 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.171154 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.193702 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.194489 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.694466723 +0000 UTC m=+225.974120170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.296329 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.296659 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.796645741 +0000 UTC m=+226.076299178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.340822 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8g6q"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.364811 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.387754 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.387799 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.387809 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.396825 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.397192 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:38.897179324 +0000 UTC m=+226.176832771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: W0313 11:50:38.411510 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cc708e6_8514_40a4_965a_991f84a8f0d4.slice/crio-c03a5e4b29825f3214f90eb49a014147c783ff7f48765251435bf91901a4745c WatchSource:0}: Error finding container c03a5e4b29825f3214f90eb49a014147c783ff7f48765251435bf91901a4745c: Status 404 returned error can't find the container with id c03a5e4b29825f3214f90eb49a014147c783ff7f48765251435bf91901a4745c Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.424357 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.505189 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.505815 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.0058031 +0000 UTC m=+226.285456547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.542933 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.589242 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cgkm9"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.606252 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.606579 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.106553109 +0000 UTC m=+226.386206556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: W0313 11:50:38.631850 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd94210f8_5f1d_4fa5_8954_14d18f8fa0e4.slice/crio-de2aa29e340c8e91e24ae9a4aa9d3073a2f5e2e77c26cc8e3e1402f3586f283b WatchSource:0}: Error finding container de2aa29e340c8e91e24ae9a4aa9d3073a2f5e2e77c26cc8e3e1402f3586f283b: Status 404 returned error can't find the container with id de2aa29e340c8e91e24ae9a4aa9d3073a2f5e2e77c26cc8e3e1402f3586f283b Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.657041 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.664739 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j6thk"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.677751 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4"] Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.706088 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" podStartSLOduration=165.706070504 podStartE2EDuration="2m45.706070504s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:38.636625765 +0000 UTC m=+225.916279242" watchObservedRunningTime="2026-03-13 11:50:38.706070504 +0000 UTC m=+225.985723951" Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.707504 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.708717 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.208701176 +0000 UTC m=+226.488354663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.777671 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" event={"ID":"5cc708e6-8514-40a4-965a-991f84a8f0d4","Type":"ContainerStarted","Data":"c03a5e4b29825f3214f90eb49a014147c783ff7f48765251435bf91901a4745c"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.778626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" event={"ID":"d94210f8-5f1d-4fa5-8954-14d18f8fa0e4","Type":"ContainerStarted","Data":"de2aa29e340c8e91e24ae9a4aa9d3073a2f5e2e77c26cc8e3e1402f3586f283b"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.780406 4786 generic.go:334] "Generic (PLEG): container finished" podID="21153583-f155-40cd-b0f0-1841cdb9c20d" containerID="3144390a19ec37ba938841d4ae0e2ca5b0de2418d872270f7c30498e073ba951" exitCode=0 Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.780472 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" event={"ID":"21153583-f155-40cd-b0f0-1841cdb9c20d","Type":"ContainerDied","Data":"3144390a19ec37ba938841d4ae0e2ca5b0de2418d872270f7c30498e073ba951"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.803256 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" 
event={"ID":"8412032b-b4df-4687-9631-f8bb7da83696","Type":"ContainerStarted","Data":"e0761ce960916d1a91766e0563777afa238eebc68aef130996cc4f05051c8a86"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.808564 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" event={"ID":"6a94b2e7-89b4-43f9-b7f6-3a433803914e","Type":"ContainerStarted","Data":"63b1e5b629336b9097e5fda385b3fbd93842b96649027933c65125a66b8dea49"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.808816 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.808995 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.308967711 +0000 UTC m=+226.588621158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.809082 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.809514 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.309505847 +0000 UTC m=+226.589159294 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.812082 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" event={"ID":"e3d7e088-128a-4842-832e-c78fdfe99913","Type":"ContainerStarted","Data":"bf698920b148bfc818529b7b414a6dcde61c9a4f70919dabd28319cc89cb0ddd"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.816743 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hmxw2" event={"ID":"4601292a-f734-479a-b592-178c0cefbea0","Type":"ContainerStarted","Data":"5fc8044b216b7631195d1b0ceeee0a3bf67230ab75df185622eddd9c0eb53037"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.816841 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hmxw2" event={"ID":"4601292a-f734-479a-b592-178c0cefbea0","Type":"ContainerStarted","Data":"3f45d4304bc714e9e176a5cc9890009bf4917fe936b65abdff9329123785746e"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.826241 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" event={"ID":"db9f1421-b65b-4929-a789-41038fa70ea8","Type":"ContainerStarted","Data":"29ea7391ba0f2e8243decd8d7abcc773522aca01e10fd84dea3d17618d7e0431"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.829441 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" event={"ID":"b96faa7c-3975-4af4-8456-42ed7bbf9897","Type":"ContainerStarted","Data":"e23eb0ccb0630c38408253cf59bfebbb0f31a1fdae148386819df9c190a60bbf"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.829469 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" event={"ID":"b96faa7c-3975-4af4-8456-42ed7bbf9897","Type":"ContainerStarted","Data":"e0f8605c6f47f2ee50a8ff64061ee018317d988f1f55f21a9cc31a9bea31bd32"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.829820 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.831763 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" event={"ID":"b09d2cd7-622a-41cb-be28-c6bd24ae1267","Type":"ContainerStarted","Data":"2a6691ec4bdca2e30eace93ef7fea5b34e23842fd79cf0bfdec7222c7ff09621"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.833915 4786 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m72sv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.833973 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" podUID="b96faa7c-3975-4af4-8456-42ed7bbf9897" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 13 11:50:38 crc kubenswrapper[4786]: W0313 11:50:38.836847 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod096982f0_2695_4021_9db9_a705d08903b0.slice/crio-da97eae631861b83f38258e5921166423c9da6dac18434a85cfe622738434ee2 WatchSource:0}: Error finding container da97eae631861b83f38258e5921166423c9da6dac18434a85cfe622738434ee2: Status 404 returned error can't find the container with id da97eae631861b83f38258e5921166423c9da6dac18434a85cfe622738434ee2 Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.843119 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" event={"ID":"19edfa3c-b549-4376-b53d-d0f8a448bdec","Type":"ContainerStarted","Data":"d966b913ee083cbc71b9e8bc93204ebe44870f900242180e6b8d79be7b016aea"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.856920 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6rd6k" event={"ID":"b7a97bba-7000-4634-9cfe-efcc38685708","Type":"ContainerStarted","Data":"8f74b38c197665798865914bfdc07ea2cac1d2c144e56c6397be3525b28bb729"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.858451 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6rd6k" Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.863409 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nzss5" event={"ID":"ed0ec184-b55e-474a-9e11-72957a85689d","Type":"ContainerStarted","Data":"082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.863446 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nzss5" event={"ID":"ed0ec184-b55e-474a-9e11-72957a85689d","Type":"ContainerStarted","Data":"fe5d9ef7255e9e108a783b4a0bfb3f12716e9a99ab2c438e1fe8fc9db78b2bc9"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.870457 4786 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-6rd6k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.870550 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6rd6k" podUID="b7a97bba-7000-4634-9cfe-efcc38685708" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 13 11:50:38 crc kubenswrapper[4786]: W0313 11:50:38.893291 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fbcc49c_59c2_4eb3_9fa3_60c57280c8af.slice/crio-501988055d46e7becc878f05941fc29becc74c5d3b46139cfc987d1aceed4c13 WatchSource:0}: Error finding container 501988055d46e7becc878f05941fc29becc74c5d3b46139cfc987d1aceed4c13: Status 404 returned error can't find the container with id 501988055d46e7becc878f05941fc29becc74c5d3b46139cfc987d1aceed4c13 Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.900711 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" event={"ID":"145932f5-ae28-4331-8bb2-a7fa535d7f96","Type":"ContainerStarted","Data":"0d7bef4b553250f18f01d0f012307c3e7c94d6bd293b19a85923fce526e3aef2"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.905541 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" event={"ID":"881358ff-be80-45e8-878e-445fcd8f8bda","Type":"ContainerStarted","Data":"fb1af267ae4b2f532f71718dc1c370c146048279c906f6eb5cf7500d67aafaa2"} Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.913475 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.914408 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hk9g4" event={"ID":"f4eaf640-4e83-4528-ac9b-52a663fd5f05","Type":"ContainerStarted","Data":"62d804d1b71bfb7d0f4c0a89bc3f5d2689729a3a3822220c01af85ebbbde5f79"} Mar 13 11:50:38 crc kubenswrapper[4786]: E0313 11:50:38.914940 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.414912743 +0000 UTC m=+226.694566190 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:38 crc kubenswrapper[4786]: I0313 11:50:38.916471 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" event={"ID":"d27bcc46-4fe6-47b9-8d99-9581308e512a","Type":"ContainerStarted","Data":"a8d4f0cdbad6a2771d3249ade194e5d0c602805826e808bf41c8a5dd79c77c07"} Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.003289 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z5l8m" 
podStartSLOduration=166.003271042 podStartE2EDuration="2m46.003271042s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:38.962151522 +0000 UTC m=+226.241804969" watchObservedRunningTime="2026-03-13 11:50:39.003271042 +0000 UTC m=+226.282924499" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.015799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.021650 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.521636526 +0000 UTC m=+226.801289973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.117687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.118455 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.618441457 +0000 UTC m=+226.898094904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.170057 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.221828 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.222159 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.722147858 +0000 UTC m=+227.001801305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.223720 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.230814 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vcflh"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.252070 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:39 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:39 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:39 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.252108 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.281190 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.289984 4786 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.297800 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nzss5" podStartSLOduration=166.297782426 podStartE2EDuration="2m46.297782426s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.245812887 +0000 UTC m=+226.525466334" watchObservedRunningTime="2026-03-13 11:50:39.297782426 +0000 UTC m=+226.577435873" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.325371 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d8crf" podStartSLOduration=166.325339473 podStartE2EDuration="2m46.325339473s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.288120651 +0000 UTC m=+226.567774108" watchObservedRunningTime="2026-03-13 11:50:39.325339473 +0000 UTC m=+226.604992920" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.326934 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.327555 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.327851 4786 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.827838762 +0000 UTC m=+227.107492199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: W0313 11:50:39.336658 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52da5870_8c41_475f_b1e8_3689fc3d43a6.slice/crio-a7c7b580965db3319ff5e8a842eb649bec5cc874f3c194ebae4550b602c40ebc WatchSource:0}: Error finding container a7c7b580965db3319ff5e8a842eb649bec5cc874f3c194ebae4550b602c40ebc: Status 404 returned error can't find the container with id a7c7b580965db3319ff5e8a842eb649bec5cc874f3c194ebae4550b602c40ebc Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.340229 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.356355 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf"] Mar 13 11:50:39 crc kubenswrapper[4786]: W0313 11:50:39.361610 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod957d3fc1_041b_405a_a88a_22b3ad8ad95a.slice/crio-760f86a69eb46e7e256dbe7d0e5663fea39931ac9567974aee8fccfe17b6e725 WatchSource:0}: Error finding container 760f86a69eb46e7e256dbe7d0e5663fea39931ac9567974aee8fccfe17b6e725: Status 404 returned error can't find the container with id 760f86a69eb46e7e256dbe7d0e5663fea39931ac9567974aee8fccfe17b6e725 Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.375681 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-62rhz"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.376697 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d5hqz"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.386856 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fgsg7"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.387173 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ft9lw" podStartSLOduration=166.387156872 podStartE2EDuration="2m46.387156872s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.360599432 +0000 UTC m=+226.640252879" watchObservedRunningTime="2026-03-13 11:50:39.387156872 +0000 UTC m=+226.666810319" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.393801 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f66sl" podStartSLOduration=166.393787364 podStartE2EDuration="2m46.393787364s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-13 11:50:39.39290909 +0000 UTC m=+226.672562547" watchObservedRunningTime="2026-03-13 11:50:39.393787364 +0000 UTC m=+226.673440811" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.428737 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.434956 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hmxw2" podStartSLOduration=5.434933035 podStartE2EDuration="5.434933035s" podCreationTimestamp="2026-03-13 11:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.430074402 +0000 UTC m=+226.709727859" watchObservedRunningTime="2026-03-13 11:50:39.434933035 +0000 UTC m=+226.714586482" Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.435219 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.931803639 +0000 UTC m=+227.211457086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.518689 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" podStartSLOduration=165.518673176 podStartE2EDuration="2m45.518673176s" podCreationTimestamp="2026-03-13 11:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.480016935 +0000 UTC m=+226.759670382" watchObservedRunningTime="2026-03-13 11:50:39.518673176 +0000 UTC m=+226.798326643" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.519902 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6rd6k" podStartSLOduration=166.51989644 podStartE2EDuration="2m46.51989644s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.517611778 +0000 UTC m=+226.797265245" watchObservedRunningTime="2026-03-13 11:50:39.51989644 +0000 UTC m=+226.799549887" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.529808 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.530098 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.03008621 +0000 UTC m=+227.309739657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.532276 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qbr8s"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.564668 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hk9g4" podStartSLOduration=166.56465037 podStartE2EDuration="2m46.56465037s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.563755585 +0000 UTC m=+226.843409022" watchObservedRunningTime="2026-03-13 11:50:39.56465037 +0000 UTC m=+226.844303817" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.577824 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.604311 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" podStartSLOduration=166.604290829 podStartE2EDuration="2m46.604290829s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.599178569 +0000 UTC m=+226.878832036" watchObservedRunningTime="2026-03-13 11:50:39.604290829 +0000 UTC m=+226.883944276" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.635037 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" podStartSLOduration=166.635020504 podStartE2EDuration="2m46.635020504s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.633559824 +0000 UTC m=+226.913213291" watchObservedRunningTime="2026-03-13 11:50:39.635020504 +0000 UTC m=+226.914673961" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.646996 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.647467 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.147453776 +0000 UTC m=+227.427107223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.719084 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7tbp9" podStartSLOduration=166.719065264 podStartE2EDuration="2m46.719065264s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:39.714274573 +0000 UTC m=+226.993928030" watchObservedRunningTime="2026-03-13 11:50:39.719065264 +0000 UTC m=+226.998718711" Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.744356 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2gm6m"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.744400 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.748704 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.748978 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.248965646 +0000 UTC m=+227.528619093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.765525 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.765776 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-bmgbc"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.775458 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vw8v9"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.790078 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-76tgr"] Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.850548 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 
11:50:39.850860 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.350846076 +0000 UTC m=+227.630499523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: W0313 11:50:39.877377 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac0a51e_aad8_4046_b8de_1e463db2b6b2.slice/crio-5576fc6fceae2eecb4caf476a56d036c10b550db8571732f9a2b5f361bbcf048 WatchSource:0}: Error finding container 5576fc6fceae2eecb4caf476a56d036c10b550db8571732f9a2b5f361bbcf048: Status 404 returned error can't find the container with id 5576fc6fceae2eecb4caf476a56d036c10b550db8571732f9a2b5f361bbcf048 Mar 13 11:50:39 crc kubenswrapper[4786]: W0313 11:50:39.897709 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c830be_ced3_4cf1_b515_de38f40e9418.slice/crio-7a1574ae806fe37d8c06d691bab1f3b71884a7426c486de24948adcc8d7a37b6 WatchSource:0}: Error finding container 7a1574ae806fe37d8c06d691bab1f3b71884a7426c486de24948adcc8d7a37b6: Status 404 returned error can't find the container with id 7a1574ae806fe37d8c06d691bab1f3b71884a7426c486de24948adcc8d7a37b6 Mar 13 11:50:39 crc kubenswrapper[4786]: W0313 11:50:39.915497 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e610d5_a3b8_4fc8_a472_01ab5bb625d5.slice/crio-68826c95e2bbf9fff2963f8ce2b41ff813e94f66dfd2232b9a4b5d7c3dbbebfc WatchSource:0}: Error finding container 68826c95e2bbf9fff2963f8ce2b41ff813e94f66dfd2232b9a4b5d7c3dbbebfc: Status 404 returned error can't find the container with id 68826c95e2bbf9fff2963f8ce2b41ff813e94f66dfd2232b9a4b5d7c3dbbebfc Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.926553 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 11:50:39 crc kubenswrapper[4786]: W0313 11:50:39.937405 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c3df92f_12d8_45f6_9369_c8b12f933e3c.slice/crio-f2e1b6b44a2ec4f2c48b198d97043f92eba5cd5703ad470785af44d11dd60730 WatchSource:0}: Error finding container f2e1b6b44a2ec4f2c48b198d97043f92eba5cd5703ad470785af44d11dd60730: Status 404 returned error can't find the container with id f2e1b6b44a2ec4f2c48b198d97043f92eba5cd5703ad470785af44d11dd60730 Mar 13 11:50:39 crc kubenswrapper[4786]: I0313 11:50:39.951460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:39 crc kubenswrapper[4786]: E0313 11:50:39.951991 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.451977156 +0000 UTC m=+227.731630603 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4786]: W0313 11:50:39.960586 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ab2a788_5f46_45b0_a4a4_43dbfceeddc9.slice/crio-8ea589db966bcf04510405df474c45d345b964a8b5ff106b3dd3fa230bfc15a1 WatchSource:0}: Error finding container 8ea589db966bcf04510405df474c45d345b964a8b5ff106b3dd3fa230bfc15a1: Status 404 returned error can't find the container with id 8ea589db966bcf04510405df474c45d345b964a8b5ff106b3dd3fa230bfc15a1 Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.004776 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" event={"ID":"7eb9105b-0760-456d-a4ed-7ef4543a7967","Type":"ContainerStarted","Data":"1edb04e935611e2d2aff52759854006f28fa1d433d37143a15e3348d5935d366"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.022324 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" event={"ID":"d7b8ca98-334c-438d-91ba-88b66fa36789","Type":"ContainerStarted","Data":"746b49b7a8953a5be7768830b1af70d5ddc5b314d47f8b55e9b5c952814656a0"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.022373 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" 
event={"ID":"d7b8ca98-334c-438d-91ba-88b66fa36789","Type":"ContainerStarted","Data":"3659486b910709560cf245d018f01b2d27d28d074658240cd9b4a91cb0761288"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.029891 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" event={"ID":"a2c830be-ced3-4cf1-b515-de38f40e9418","Type":"ContainerStarted","Data":"7a1574ae806fe37d8c06d691bab1f3b71884a7426c486de24948adcc8d7a37b6"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.050627 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" event={"ID":"02e6f07f-826a-4784-bd45-87ee7984ad04","Type":"ContainerStarted","Data":"131e4cbe0e4045d023ea1e71ab61489b55fc5420536036a4503ff8ae46c1a195"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.058066 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8sn6r" podStartSLOduration=167.058047611 podStartE2EDuration="2m47.058047611s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.049568697 +0000 UTC m=+227.329222134" watchObservedRunningTime="2026-03-13 11:50:40.058047611 +0000 UTC m=+227.337701058" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.060296 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.061205 4786 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.561193387 +0000 UTC m=+227.840846824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.114827 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" event={"ID":"4675c625-0d2c-4358-9241-627d96dcb2f0","Type":"ContainerStarted","Data":"56c05258f4546d6e676a68dfb301418f6e7f8fcce3cd8dcf8438a2b735e7eb25"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.114863 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" event={"ID":"4675c625-0d2c-4358-9241-627d96dcb2f0","Type":"ContainerStarted","Data":"f8e659878e65e3880486e67991658b742f27c88616b241ce98a0fd07812881db"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.120530 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" event={"ID":"d94210f8-5f1d-4fa5-8954-14d18f8fa0e4","Type":"ContainerStarted","Data":"4c7b8520183c89f74917147f8183d0cb50d74771183aa82861a25fb3fb248237"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.124443 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" 
event={"ID":"db9f1421-b65b-4929-a789-41038fa70ea8","Type":"ContainerStarted","Data":"66ed9032f78c56e5b35559d9f5ce2a522da021a9198e8ee09b92c4acad6cc8cd"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.126831 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" event={"ID":"6a94b2e7-89b4-43f9-b7f6-3a433803914e","Type":"ContainerStarted","Data":"99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.127396 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.128397 4786 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-m9k26 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.128427 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" podUID="6a94b2e7-89b4-43f9-b7f6-3a433803914e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.153785 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" event={"ID":"21153583-f155-40cd-b0f0-1841cdb9c20d","Type":"ContainerStarted","Data":"83f6e475b361561daed8e3cadb68c6c3ca23d760834612ff23bd5a4c97216ea6"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.159869 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" event={"ID":"4bf3c97d-e226-404e-814c-1d9a9c525ab2","Type":"ContainerStarted","Data":"fdf1f7048f39baf8354358a2d46613f1800bb29c6d28ccffb879a88ae84b1326"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.161344 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.161520 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.661503913 +0000 UTC m=+227.941157360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.161603 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.163305 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.663277583 +0000 UTC m=+227.942931030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.171271 4786 generic.go:334] "Generic (PLEG): container finished" podID="cb2b03d4-6c20-45cf-97a9-26ba24eff125" containerID="a6aa3f5c97db2984b883923aab579b78fac3e0b45f4fbf649cf0ffddca0cd736" exitCode=0 Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.171368 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" event={"ID":"cb2b03d4-6c20-45cf-97a9-26ba24eff125","Type":"ContainerDied","Data":"a6aa3f5c97db2984b883923aab579b78fac3e0b45f4fbf649cf0ffddca0cd736"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.171397 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" event={"ID":"cb2b03d4-6c20-45cf-97a9-26ba24eff125","Type":"ContainerStarted","Data":"d028ba9eaf0cc5e8f9dd48a39385fbad2b92c4fbb9da072f9da5b3856e5e316d"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.178100 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j6xnk" podStartSLOduration=167.178082229 podStartE2EDuration="2m47.178082229s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.177311628 +0000 UTC m=+227.456965075" watchObservedRunningTime="2026-03-13 11:50:40.178082229 +0000 UTC 
m=+227.457735676" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.178353 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" podStartSLOduration=167.178344956 podStartE2EDuration="2m47.178344956s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.144308741 +0000 UTC m=+227.423962188" watchObservedRunningTime="2026-03-13 11:50:40.178344956 +0000 UTC m=+227.457998423" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.178815 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:40 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:40 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:40 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.178852 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.186701 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" event={"ID":"19edfa3c-b549-4376-b53d-d0f8a448bdec","Type":"ContainerStarted","Data":"05cd5ed19d58319a3215a96c955b90975814f393c1a45d2cae43eb83028ea6e2"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.189193 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" 
event={"ID":"7b90dd9c-139c-4527-ab2b-258022bca18a","Type":"ContainerStarted","Data":"3b68b607773c1e09c5a5953a27b5b3c18dbc66ce1e94c80a8d64fcd062783966"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.200189 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" podStartSLOduration=166.200171426 podStartE2EDuration="2m46.200171426s" podCreationTimestamp="2026-03-13 11:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.197308508 +0000 UTC m=+227.476961965" watchObservedRunningTime="2026-03-13 11:50:40.200171426 +0000 UTC m=+227.479824873" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.211949 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" event={"ID":"d27bcc46-4fe6-47b9-8d99-9581308e512a","Type":"ContainerStarted","Data":"826d758c250558c4fc71f3337ba08113477a651722b18642200545df507ac364"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.212171 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" event={"ID":"d27bcc46-4fe6-47b9-8d99-9581308e512a","Type":"ContainerStarted","Data":"5dc93c31cf64ab7c8cb87f9797ce66c2caeae2ad6640242cd91760a42724cb8d"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.218821 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d5hqz" event={"ID":"344058e9-4126-475b-9108-e877dbb8201e","Type":"ContainerStarted","Data":"d26c13ba7cf2a93d092282e1c5d05cecec36ea0948be7a99ecae8be4ad1b198a"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.219615 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.221376 4786 patch_prober.go:28] interesting pod/console-operator-58897d9998-d5hqz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.221420 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-d5hqz" podUID="344058e9-4126-475b-9108-e877dbb8201e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.224316 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5h7bj" podStartSLOduration=167.224299279 podStartE2EDuration="2m47.224299279s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.216541647 +0000 UTC m=+227.496195094" watchObservedRunningTime="2026-03-13 11:50:40.224299279 +0000 UTC m=+227.503952726" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.241770 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" event={"ID":"5c09ab49-3d49-495b-af13-5fd937259b53","Type":"ContainerStarted","Data":"e6aacd243e897fa7a5b3d743fc9a245fb30e96f1e7b8189810ee944c40028af0"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.242572 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:40 crc kubenswrapper[4786]: 
I0313 11:50:40.254032 4786 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-62rhz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.254089 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.254562 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbr8s" event={"ID":"8f345a34-32f4-4c84-beb8-079212add522","Type":"ContainerStarted","Data":"342655ef34af86656190f88a29f4137a9df31eb9c6324135c982eb3bfa7dfd6f"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.260016 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wwmt6" podStartSLOduration=166.260002661 podStartE2EDuration="2m46.260002661s" podCreationTimestamp="2026-03-13 11:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.258716856 +0000 UTC m=+227.538370313" watchObservedRunningTime="2026-03-13 11:50:40.260002661 +0000 UTC m=+227.539656108" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.262726 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.263654 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.76362854 +0000 UTC m=+228.043281987 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.292176 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" event={"ID":"5cc708e6-8514-40a4-965a-991f84a8f0d4","Type":"ContainerStarted","Data":"60b422260c85fa38455bc46485e4992b1766839a4827a2e1211714b610f39208"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.300153 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" event={"ID":"096982f0-2695-4021-9db9-a705d08903b0","Type":"ContainerStarted","Data":"e8f16e8e1e9d6d9f1dec57f534aafd4a79bfed52c0e35a8eecec5be7e5460209"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.300210 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" event={"ID":"096982f0-2695-4021-9db9-a705d08903b0","Type":"ContainerStarted","Data":"da97eae631861b83f38258e5921166423c9da6dac18434a85cfe622738434ee2"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.319496 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mjkr" podStartSLOduration=167.319476465 podStartE2EDuration="2m47.319476465s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.279390334 +0000 UTC m=+227.559043781" watchObservedRunningTime="2026-03-13 11:50:40.319476465 +0000 UTC m=+227.599129912" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.321085 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2gm6m" event={"ID":"2ac0a51e-aad8-4046-b8de-1e463db2b6b2","Type":"ContainerStarted","Data":"5576fc6fceae2eecb4caf476a56d036c10b550db8571732f9a2b5f361bbcf048"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.332351 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" event={"ID":"957d3fc1-041b-405a-a88a-22b3ad8ad95a","Type":"ContainerStarted","Data":"760f86a69eb46e7e256dbe7d0e5663fea39931ac9567974aee8fccfe17b6e725"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.333080 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.334734 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" event={"ID":"b09d2cd7-622a-41cb-be28-c6bd24ae1267","Type":"ContainerStarted","Data":"f59c00e6fe4847bf58a8e355a2c3c31ee88461519a399f938ab2229e6e713394"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.337533 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" 
podStartSLOduration=167.337519591 podStartE2EDuration="2m47.337519591s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.30836411 +0000 UTC m=+227.588017557" watchObservedRunningTime="2026-03-13 11:50:40.337519591 +0000 UTC m=+227.617173038" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.352547 4786 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-kpxm6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.352599 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" podUID="957d3fc1-041b-405a-a88a-22b3ad8ad95a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.356076 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" event={"ID":"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af","Type":"ContainerStarted","Data":"f87d1b0f52b2c045668bb54cff3aa05ecd85e81383de778a1f549163a58d673b"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.356126 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" event={"ID":"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af","Type":"ContainerStarted","Data":"669cfbda44ea52658734801246fb29086de11c383b4f231cbd82c1d65d88bc75"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.356136 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" event={"ID":"4fbcc49c-59c2-4eb3-9fa3-60c57280c8af","Type":"ContainerStarted","Data":"501988055d46e7becc878f05941fc29becc74c5d3b46139cfc987d1aceed4c13"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.367429 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" event={"ID":"d4ad78e0-02e4-4013-9f61-9f344b5e3f15","Type":"ContainerStarted","Data":"f8b1cdb62b3062e672d9f26a752fe3e8616a33263260fe1cfc8630865c8917a0"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.374693 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" event={"ID":"52da5870-8c41-475f-b1e8-3689fc3d43a6","Type":"ContainerStarted","Data":"a7c7b580965db3319ff5e8a842eb649bec5cc874f3c194ebae4550b602c40ebc"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.381555 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.383032 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.883018922 +0000 UTC m=+228.162672369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.390126 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" event={"ID":"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5","Type":"ContainerStarted","Data":"eb12040721b0c47832646aabee24c1e7087221fee19f70882f0b58af9f0b2819"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.390161 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" event={"ID":"7aee0f31-69c3-4d4c-bc7b-c9a3123898f5","Type":"ContainerStarted","Data":"e9edd887f21e68561a3b3256960bd4739a307076b12dd291a2364a39545be0fa"} Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.392843 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" podStartSLOduration=166.392823091 podStartE2EDuration="2m46.392823091s" podCreationTimestamp="2026-03-13 11:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.366614651 +0000 UTC m=+227.646268098" watchObservedRunningTime="2026-03-13 11:50:40.392823091 +0000 UTC m=+227.672476538" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.393759 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-6rd6k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: 
connection refused" start-of-body= Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.393812 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6rd6k" podUID="b7a97bba-7000-4634-9cfe-efcc38685708" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.419034 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m72sv" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.428928 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vjl6f" podStartSLOduration=167.428909002 podStartE2EDuration="2m47.428909002s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.413033456 +0000 UTC m=+227.692686903" watchObservedRunningTime="2026-03-13 11:50:40.428909002 +0000 UTC m=+227.708562449" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.429315 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-d5hqz" podStartSLOduration=167.429310554 podStartE2EDuration="2m47.429310554s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.428304566 +0000 UTC m=+227.707958023" watchObservedRunningTime="2026-03-13 11:50:40.429310554 +0000 UTC m=+227.708964021" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.469146 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-c8g6q" podStartSLOduration=167.469130728 podStartE2EDuration="2m47.469130728s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.468676036 +0000 UTC m=+227.748329483" watchObservedRunningTime="2026-03-13 11:50:40.469130728 +0000 UTC m=+227.748784175" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.483254 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.483459 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:40.983444851 +0000 UTC m=+228.263098298 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.483811 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.500144 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" podStartSLOduration=167.50012909 podStartE2EDuration="2m47.50012909s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.498409223 +0000 UTC m=+227.778062670" watchObservedRunningTime="2026-03-13 11:50:40.50012909 +0000 UTC m=+227.779782537" Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.504525 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.00450635 +0000 UTC m=+228.284159797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.583132 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" podStartSLOduration=167.583116412 podStartE2EDuration="2m47.583116412s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.581863207 +0000 UTC m=+227.861516664" watchObservedRunningTime="2026-03-13 11:50:40.583116412 +0000 UTC m=+227.862769859" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.611635 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.612047 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.112030816 +0000 UTC m=+228.391684263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.648580 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vcflh" podStartSLOduration=166.648558619 podStartE2EDuration="2m46.648558619s" podCreationTimestamp="2026-03-13 11:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.626953446 +0000 UTC m=+227.906606903" watchObservedRunningTime="2026-03-13 11:50:40.648558619 +0000 UTC m=+227.928212066" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.669645 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6bvx4" podStartSLOduration=167.669631209 podStartE2EDuration="2m47.669631209s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.649706201 +0000 UTC m=+227.929359648" watchObservedRunningTime="2026-03-13 11:50:40.669631209 +0000 UTC m=+227.949284656" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.672521 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" podStartSLOduration=167.672509198 podStartE2EDuration="2m47.672509198s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:40.667439759 +0000 UTC m=+227.947093206" watchObservedRunningTime="2026-03-13 11:50:40.672509198 +0000 UTC m=+227.952162645" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.706186 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47332: no serving certificate available for the kubelet" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.713927 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.714373 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.214362028 +0000 UTC m=+228.494015475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.823360 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.823902 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.323874987 +0000 UTC m=+228.603528424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.824314 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47338: no serving certificate available for the kubelet" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.918912 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47340: no serving certificate available for the kubelet" Mar 13 11:50:40 crc kubenswrapper[4786]: I0313 11:50:40.935729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:40 crc kubenswrapper[4786]: E0313 11:50:40.936065 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.436050431 +0000 UTC m=+228.715703878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.030351 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47344: no serving certificate available for the kubelet" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.037087 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.037286 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.537268643 +0000 UTC m=+228.816922080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.038322 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.038675 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.538667101 +0000 UTC m=+228.818320548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.114875 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47354: no serving certificate available for the kubelet" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.139673 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.139865 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.639838561 +0000 UTC m=+228.919491998 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.139957 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.140400 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.640393337 +0000 UTC m=+228.920046784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.183687 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:41 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:41 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:41 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.183738 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.214668 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.214986 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.225141 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47366: no serving certificate available for the kubelet" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.228059 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.240843 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.241143 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.741129625 +0000 UTC m=+229.020783072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.341589 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47380: no serving certificate available for the kubelet" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.341968 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.342329 4786 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.842316617 +0000 UTC m=+229.121970054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.419989 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k8t2r" event={"ID":"4bf3c97d-e226-404e-814c-1d9a9c525ab2","Type":"ContainerStarted","Data":"7f8a7bc48418cbf9c91d919137aec05ad3bbb2c473006a9f8ceaa280ff81fb3f"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.422826 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" event={"ID":"5c09ab49-3d49-495b-af13-5fd937259b53","Type":"ContainerStarted","Data":"d9e2a18ec26abb77781ac10c014721ebaa13bc4e975384eb1fb6072464617c55"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.423534 4786 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-62rhz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.423559 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" 
podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.430695 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" event={"ID":"7b90dd9c-139c-4527-ab2b-258022bca18a","Type":"ContainerStarted","Data":"8a5629f7f367ae47f66ceeb47c19e54544c682fd200234dfbf28184de370126f"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.430731 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" event={"ID":"7b90dd9c-139c-4527-ab2b-258022bca18a","Type":"ContainerStarted","Data":"dc5124b3e547370f12afa92b50d494f7b63f966492fe9c1729fdedddba37e62a"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.430983 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.443266 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.443561 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:41.943543678 +0000 UTC m=+229.223197125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.455310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-md56c" event={"ID":"52da5870-8c41-475f-b1e8-3689fc3d43a6","Type":"ContainerStarted","Data":"d2cf97d50edd60c55009db610247f9e55ba6d77ce2122c7617efb19c599d0ab4"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.462406 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" podStartSLOduration=168.462388887 podStartE2EDuration="2m48.462388887s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.460618477 +0000 UTC m=+228.740271924" watchObservedRunningTime="2026-03-13 11:50:41.462388887 +0000 UTC m=+228.742042334" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.464726 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" event={"ID":"096982f0-2695-4021-9db9-a705d08903b0","Type":"ContainerStarted","Data":"1ca130efadb5289c4d93190f92003b4b8bd45e867ae025753fe45911222a099b"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.483106 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" 
event={"ID":"957d3fc1-041b-405a-a88a-22b3ad8ad95a","Type":"ContainerStarted","Data":"089f5ff02976a30759b2fef3c5fb00ae5cec5aea056df796cc81cd95581f9d12"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.487295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" event={"ID":"1c3df92f-12d8-45f6-9369-c8b12f933e3c","Type":"ContainerStarted","Data":"18e0f0ae3625ba59dfc6a8919b0a66d7ce14e082c32b6ce733ce95d259a44816"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.487333 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" event={"ID":"1c3df92f-12d8-45f6-9369-c8b12f933e3c","Type":"ContainerStarted","Data":"f2e1b6b44a2ec4f2c48b198d97043f92eba5cd5703ad470785af44d11dd60730"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.490036 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d5hqz" event={"ID":"344058e9-4126-475b-9108-e877dbb8201e","Type":"ContainerStarted","Data":"c48db08412652bd42c4fcadb3a25d46f3afd54de861fc524a97ee15e2bbd5b2d"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.490765 4786 patch_prober.go:28] interesting pod/console-operator-58897d9998-d5hqz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.490792 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-d5hqz" podUID="344058e9-4126-475b-9108-e877dbb8201e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 
11:50:41.494862 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47390: no serving certificate available for the kubelet" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.496449 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" event={"ID":"a2c830be-ced3-4cf1-b515-de38f40e9418","Type":"ContainerStarted","Data":"191c1ac800054d2135d34a3378e828120dae9ddbd08ca574635d7fe476b249af"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.496499 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" event={"ID":"a2c830be-ced3-4cf1-b515-de38f40e9418","Type":"ContainerStarted","Data":"a31ee093ff6ea933ddb7b9c91d956ebc5f152d208de7fc6b11364cdfb86b7056"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.509514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hkcqf" event={"ID":"d4ad78e0-02e4-4013-9f61-9f344b5e3f15","Type":"ContainerStarted","Data":"91fbed6294b97c1bc4dbb6f75f47860e8ddd7cab1386ae4f77d3a9bf20096ac6"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.510934 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kpxm6" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.515026 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-cgkm9" podStartSLOduration=168.515008492 podStartE2EDuration="2m48.515008492s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.511185817 +0000 UTC m=+228.790839264" watchObservedRunningTime="2026-03-13 11:50:41.515008492 +0000 UTC m=+228.794661939" Mar 13 11:50:41 crc 
kubenswrapper[4786]: I0313 11:50:41.515629 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" event={"ID":"01e610d5-a3b8-4fc8-a472-01ab5bb625d5","Type":"ContainerStarted","Data":"68826c95e2bbf9fff2963f8ce2b41ff813e94f66dfd2232b9a4b5d7c3dbbebfc"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.520566 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" event={"ID":"02e6f07f-826a-4784-bd45-87ee7984ad04","Type":"ContainerStarted","Data":"a6d7269a2a2d9d2f7c00420b762115e0b4ab6a168d3529c2de092ebbf5a7c76b"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.526533 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" event={"ID":"7eb9105b-0760-456d-a4ed-7ef4543a7967","Type":"ContainerStarted","Data":"d0d324be40c9116c631e63d676c37fa469ed2f84853fc7b4f901143b9f72c68d"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.526599 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" event={"ID":"7eb9105b-0760-456d-a4ed-7ef4543a7967","Type":"ContainerStarted","Data":"3c1419461f27a40cb4e3abf2065266954c4fc3c2f4b079aedabcdd0b8c0f75b2"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.544375 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.547629 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.047616888 +0000 UTC m=+229.327270325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.544489 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" event={"ID":"cb2b03d4-6c20-45cf-97a9-26ba24eff125","Type":"ContainerStarted","Data":"fbfc8dba201d950452c89cbf0e29127394a43685b89df7a05d4312480b6073a2"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.547987 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.557163 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbr8s" event={"ID":"8f345a34-32f4-4c84-beb8-079212add522","Type":"ContainerStarted","Data":"1e9641078669274ffba4633f1dd9487869e9d10b73a0d194f5baf57b60bfbee6"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.557202 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbr8s" event={"ID":"8f345a34-32f4-4c84-beb8-079212add522","Type":"ContainerStarted","Data":"0299c82bcc74d7fd271eacbd0af8d3123f4cef0f0c082bb359725ccf2bb917cb"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.557214 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.601417 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" event={"ID":"21153583-f155-40cd-b0f0-1841cdb9c20d","Type":"ContainerStarted","Data":"e8792f81a181b724fe93d69c9a42ec2b7a01a0175b78dfdbf0f3d6c9687ac1f7"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.602805 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-76tgr" podStartSLOduration=168.602790285 podStartE2EDuration="2m48.602790285s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.592253865 +0000 UTC m=+228.871907322" watchObservedRunningTime="2026-03-13 11:50:41.602790285 +0000 UTC m=+228.882443732" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.619177 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vw8v9" podStartSLOduration=168.619164745 podStartE2EDuration="2m48.619164745s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.618833576 +0000 UTC m=+228.898487023" watchObservedRunningTime="2026-03-13 11:50:41.619164745 +0000 UTC m=+228.898818192" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.622505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" event={"ID":"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9","Type":"ContainerStarted","Data":"7d41e1129a96dd6415f68746d913976595e0558fdd934377d081ab6ddcc6e575"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.622546 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" 
event={"ID":"0ab2a788-5f46-45b0-a4a4-43dbfceeddc9","Type":"ContainerStarted","Data":"8ea589db966bcf04510405df474c45d345b964a8b5ff106b3dd3fa230bfc15a1"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.623334 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.632907 4786 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-557zt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.632960 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" podUID="0ab2a788-5f46-45b0-a4a4-43dbfceeddc9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.643378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2gm6m" event={"ID":"2ac0a51e-aad8-4046-b8de-1e463db2b6b2","Type":"ContainerStarted","Data":"d2edde3a68d7d9ce8fc9346aa1912bc3cbc2d713720243a002600cc33458f5ed"} Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.648124 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-6rd6k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.648152 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6rd6k" podUID="b7a97bba-7000-4634-9cfe-efcc38685708" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.648608 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.649586 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.149574171 +0000 UTC m=+229.429227608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.661076 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.662417 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ftnwb" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.735074 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8dklp" podStartSLOduration=168.73504859 podStartE2EDuration="2m48.73504859s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.672148601 +0000 UTC m=+228.951802048" watchObservedRunningTime="2026-03-13 11:50:41.73504859 +0000 UTC m=+229.014702037" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.751359 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.751673 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.251661877 +0000 UTC m=+229.531315324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.779262 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" podStartSLOduration=168.779246995 podStartE2EDuration="2m48.779246995s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.737381644 +0000 UTC m=+229.017035091" watchObservedRunningTime="2026-03-13 11:50:41.779246995 +0000 UTC m=+229.058900442" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.812169 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.812672 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.814416 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qbr8s" podStartSLOduration=7.814401641 podStartE2EDuration="7.814401641s" podCreationTimestamp="2026-03-13 11:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.812369345 +0000 UTC m=+229.092022792" watchObservedRunningTime="2026-03-13 11:50:41.814401641 +0000 UTC m=+229.094055088" 
Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.815242 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" podStartSLOduration=168.815237064 podStartE2EDuration="2m48.815237064s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.779604875 +0000 UTC m=+229.059258332" watchObservedRunningTime="2026-03-13 11:50:41.815237064 +0000 UTC m=+229.094890511" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.854645 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.854983 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.354968076 +0000 UTC m=+229.634621523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.911649 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2gm6m" podStartSLOduration=7.911635263 podStartE2EDuration="7.911635263s" podCreationTimestamp="2026-03-13 11:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.903212712 +0000 UTC m=+229.182866159" watchObservedRunningTime="2026-03-13 11:50:41.911635263 +0000 UTC m=+229.191288710" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.924787 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" podStartSLOduration=168.924770785 podStartE2EDuration="2m48.924770785s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:41.923243723 +0000 UTC m=+229.202897170" watchObservedRunningTime="2026-03-13 11:50:41.924770785 +0000 UTC m=+229.204424232" Mar 13 11:50:41 crc kubenswrapper[4786]: I0313 11:50:41.956488 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: 
\"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:41 crc kubenswrapper[4786]: E0313 11:50:41.956814 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.456803145 +0000 UTC m=+229.736456592 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.057717 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.057895 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.557859392 +0000 UTC m=+229.837512839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.057989 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.058256 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.558243623 +0000 UTC m=+229.837897070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.158484 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.158656 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.658630222 +0000 UTC m=+229.938283669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.159015 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.159335 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.6593263 +0000 UTC m=+229.938979747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.162708 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8q2gx"] Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.162909 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" podUID="8a3e1982-e5ea-40c7-b606-7b4464d32971" containerName="controller-manager" containerID="cri-o://6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e" gracePeriod=30 Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.177558 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:42 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:42 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:42 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.177612 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.223910 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26"] Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.240028 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47394: no serving certificate available for the kubelet" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.260617 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.260748 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.760730248 +0000 UTC m=+230.040383695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.260907 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.261161 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.761153399 +0000 UTC m=+230.040806846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.364264 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.364434 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.864404937 +0000 UTC m=+230.144058384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.364782 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.365056 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.865048384 +0000 UTC m=+230.144701831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.471387 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.471659 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:42.971644204 +0000 UTC m=+230.251297651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.572439 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.573166 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.073152604 +0000 UTC m=+230.352806061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.665342 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.674444 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.674775 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.174760896 +0000 UTC m=+230.454414343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.686188 4786 generic.go:334] "Generic (PLEG): container finished" podID="8a3e1982-e5ea-40c7-b606-7b4464d32971" containerID="6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e" exitCode=0 Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.686300 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" event={"ID":"8a3e1982-e5ea-40c7-b606-7b4464d32971","Type":"ContainerDied","Data":"6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e"} Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.686328 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" event={"ID":"8a3e1982-e5ea-40c7-b606-7b4464d32971","Type":"ContainerDied","Data":"c986b9fffad44768268dde0d0c00c679d96aca70290e7a8cd74be887713de92a"} Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.686343 4786 scope.go:117] "RemoveContainer" containerID="6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.686440 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8q2gx" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.697227 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" event={"ID":"02e6f07f-826a-4784-bd45-87ee7984ad04","Type":"ContainerStarted","Data":"2020fe369552f897ff174e2447ea549f513c38af3c6c1857425fc9d787d6fb2b"} Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.698575 4786 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-62rhz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.698615 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.708919 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-d5hqz" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 
11:50:42.753046 4786 scope.go:117] "RemoveContainer" containerID="6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.766792 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e\": container with ID starting with 6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e not found: ID does not exist" containerID="6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.766835 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e"} err="failed to get container status \"6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e\": rpc error: code = NotFound desc = could not find container \"6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e\": container with ID starting with 6b61fd2a63ec262168a819e7d58e9dbdceec4f968d02da4402df829d0401961e not found: ID does not exist" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.778195 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-config\") pod \"8a3e1982-e5ea-40c7-b606-7b4464d32971\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.778248 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e1982-e5ea-40c7-b606-7b4464d32971-serving-cert\") pod \"8a3e1982-e5ea-40c7-b606-7b4464d32971\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.778289 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2t96f\" (UniqueName: \"kubernetes.io/projected/8a3e1982-e5ea-40c7-b606-7b4464d32971-kube-api-access-2t96f\") pod \"8a3e1982-e5ea-40c7-b606-7b4464d32971\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.778330 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-proxy-ca-bundles\") pod \"8a3e1982-e5ea-40c7-b606-7b4464d32971\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.778348 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-client-ca\") pod \"8a3e1982-e5ea-40c7-b606-7b4464d32971\" (UID: \"8a3e1982-e5ea-40c7-b606-7b4464d32971\") " Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.778534 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.781720 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-config" (OuterVolumeSpecName: "config") pod "8a3e1982-e5ea-40c7-b606-7b4464d32971" (UID: "8a3e1982-e5ea-40c7-b606-7b4464d32971"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.783796 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-client-ca" (OuterVolumeSpecName: "client-ca") pod "8a3e1982-e5ea-40c7-b606-7b4464d32971" (UID: "8a3e1982-e5ea-40c7-b606-7b4464d32971"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.785142 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8a3e1982-e5ea-40c7-b606-7b4464d32971" (UID: "8a3e1982-e5ea-40c7-b606-7b4464d32971"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.789537 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.28952063 +0000 UTC m=+230.569174077 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.790296 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3e1982-e5ea-40c7-b606-7b4464d32971-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8a3e1982-e5ea-40c7-b606-7b4464d32971" (UID: "8a3e1982-e5ea-40c7-b606-7b4464d32971"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.792808 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3e1982-e5ea-40c7-b606-7b4464d32971-kube-api-access-2t96f" (OuterVolumeSpecName: "kube-api-access-2t96f") pod "8a3e1982-e5ea-40c7-b606-7b4464d32971" (UID: "8a3e1982-e5ea-40c7-b606-7b4464d32971"). InnerVolumeSpecName "kube-api-access-2t96f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.880321 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.880571 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.380519081 +0000 UTC m=+230.660172518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.880744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.880849 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-config\") on node \"crc\" DevicePath 
\"\"" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.880859 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e1982-e5ea-40c7-b606-7b4464d32971-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.880868 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t96f\" (UniqueName: \"kubernetes.io/projected/8a3e1982-e5ea-40c7-b606-7b4464d32971-kube-api-access-2t96f\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.880892 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.880900 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e1982-e5ea-40c7-b606-7b4464d32971-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.881217 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.381208341 +0000 UTC m=+230.660861788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.982473 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.982642 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.482616237 +0000 UTC m=+230.762269684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:42 crc kubenswrapper[4786]: I0313 11:50:42.982805 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:42 crc kubenswrapper[4786]: E0313 11:50:42.983196 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.483181333 +0000 UTC m=+230.762834780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.015824 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8q2gx"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.019081 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8q2gx"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.021552 4786 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.023237 4786 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pckgr container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]log ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]etcd ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/max-in-flight-filter ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 13 11:50:43 crc kubenswrapper[4786]: 
[-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 13 11:50:43 crc kubenswrapper[4786]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/project.openshift.io-projectcache ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-startinformers ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 13 11:50:43 crc kubenswrapper[4786]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 11:50:43 crc kubenswrapper[4786]: livez check failed Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.023291 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" podUID="21153583-f155-40cd-b0f0-1841cdb9c20d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.083428 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.083613 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.583590492 +0000 UTC m=+230.863243939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.083671 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.084161 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.584144208 +0000 UTC m=+230.863797655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.172430 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:43 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:43 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:43 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.172495 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.185068 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.185269 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 11:50:43.685234646 +0000 UTC m=+230.964888093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.185707 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.186056 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.686040419 +0000 UTC m=+230.965693866 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.239848 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j6thk" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.245987 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-94x7m"] Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.246194 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3e1982-e5ea-40c7-b606-7b4464d32971" containerName="controller-manager" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.246210 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3e1982-e5ea-40c7-b606-7b4464d32971" containerName="controller-manager" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.246324 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3e1982-e5ea-40c7-b606-7b4464d32971" containerName="controller-manager" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.247040 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.247616 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-557zt" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.249177 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.298508 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.299065 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.799049674 +0000 UTC m=+231.078703121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.301816 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94x7m"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.400685 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.400730 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzfjw\" (UniqueName: \"kubernetes.io/projected/f8353c7b-cabe-46a6-8a98-aea4bad6b499-kube-api-access-dzfjw\") pod \"certified-operators-94x7m\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.400785 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-catalog-content\") pod \"certified-operators-94x7m\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.400812 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-utilities\") pod \"certified-operators-94x7m\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.401125 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:43.901112739 +0000 UTC m=+231.180766186 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.451024 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3e1982-e5ea-40c7-b606-7b4464d32971" path="/var/lib/kubelet/pods/8a3e1982-e5ea-40c7-b606-7b4464d32971/volumes" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.451435 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4tp7g"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.452281 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.454059 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.455532 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4tp7g"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.502318 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.502540 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-utilities\") pod \"certified-operators-94x7m\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.502624 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzfjw\" (UniqueName: \"kubernetes.io/projected/f8353c7b-cabe-46a6-8a98-aea4bad6b499-kube-api-access-dzfjw\") pod \"certified-operators-94x7m\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.502672 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-catalog-content\") pod \"certified-operators-94x7m\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " 
pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.503136 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:44.003118013 +0000 UTC m=+231.282771460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.503382 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-catalog-content\") pod \"certified-operators-94x7m\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.503447 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-utilities\") pod \"certified-operators-94x7m\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.525805 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzfjw\" (UniqueName: \"kubernetes.io/projected/f8353c7b-cabe-46a6-8a98-aea4bad6b499-kube-api-access-dzfjw\") pod \"certified-operators-94x7m\" (UID: 
\"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.550012 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47404: no serving certificate available for the kubelet" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.603631 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-catalog-content\") pod \"community-operators-4tp7g\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.603680 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9w9g\" (UniqueName: \"kubernetes.io/projected/939749d8-2927-47a2-8edc-77b4f307e813-kube-api-access-m9w9g\") pod \"community-operators-4tp7g\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.603724 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.603787 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-utilities\") pod \"community-operators-4tp7g\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc 
kubenswrapper[4786]: E0313 11:50:43.604182 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:44.10416607 +0000 UTC m=+231.383819517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.605269 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.637471 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f4982"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.638970 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.648746 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4982"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.706212 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.706339 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:44.206322437 +0000 UTC m=+231.485975884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.706440 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.706499 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-utilities\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.706525 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-utilities\") pod \"community-operators-4tp7g\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.706588 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-catalog-content\") pod \"community-operators-4tp7g\" (UID: 
\"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.706608 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c2rc\" (UniqueName: \"kubernetes.io/projected/be9e61e1-45b9-42e3-899f-495a710537fc-kube-api-access-5c2rc\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.706625 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9w9g\" (UniqueName: \"kubernetes.io/projected/939749d8-2927-47a2-8edc-77b4f307e813-kube-api-access-m9w9g\") pod \"community-operators-4tp7g\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.706642 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-catalog-content\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: E0313 11:50:43.706725 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:50:44.206717748 +0000 UTC m=+231.486371195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4rkw" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.707024 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-utilities\") pod \"community-operators-4tp7g\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.707157 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-catalog-content\") pod \"community-operators-4tp7g\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.708390 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" event={"ID":"02e6f07f-826a-4784-bd45-87ee7984ad04","Type":"ContainerStarted","Data":"977104a68863d0c9555ca5163bc815bd789238afedd897ae37c95696f1f90b0d"} Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.708423 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" event={"ID":"02e6f07f-826a-4784-bd45-87ee7984ad04","Type":"ContainerStarted","Data":"a9ccfec383e1f88733b239f0c14a19e51ade2db35a58709742c60c8ab28055f3"} Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.709752 4786 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" podUID="6a94b2e7-89b4-43f9-b7f6-3a433803914e" containerName="route-controller-manager" containerID="cri-o://99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c" gracePeriod=30 Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.727023 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-fgsg7" podStartSLOduration=9.727004556 podStartE2EDuration="9.727004556s" podCreationTimestamp="2026-03-13 11:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:43.725844834 +0000 UTC m=+231.005498271" watchObservedRunningTime="2026-03-13 11:50:43.727004556 +0000 UTC m=+231.006658003" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.740257 4786 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T11:50:43.021571948Z","Handler":null,"Name":""} Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.744972 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9w9g\" (UniqueName: \"kubernetes.io/projected/939749d8-2927-47a2-8edc-77b4f307e813-kube-api-access-m9w9g\") pod \"community-operators-4tp7g\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.745971 4786 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.746001 4786 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: 
/var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.761831 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bd9b66b4b-sn528"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.762419 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.763402 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.763601 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.767984 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bd9b66b4b-sn528"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.769826 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.770478 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.770517 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.770642 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.770756 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.773532 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.807300 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.807606 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c2rc\" (UniqueName: \"kubernetes.io/projected/be9e61e1-45b9-42e3-899f-495a710537fc-kube-api-access-5c2rc\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.807657 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-catalog-content\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.807807 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-utilities\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.809791 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-utilities\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.810677 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-catalog-content\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.814467 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.830701 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c2rc\" (UniqueName: \"kubernetes.io/projected/be9e61e1-45b9-42e3-899f-495a710537fc-kube-api-access-5c2rc\") pod \"certified-operators-f4982\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.835372 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9bkf6"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.836329 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.844266 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9bkf6"] Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.909500 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-proxy-ca-bundles\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.909546 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-catalog-content\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.909566 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb55fe-bd61-406c-8bc2-c7963fabd631-serving-cert\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.909622 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-utilities\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.909639 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zj65\" (UniqueName: \"kubernetes.io/projected/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-kube-api-access-7zj65\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.909668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-config\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.909703 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-client-ca\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:43 crc 
kubenswrapper[4786]: I0313 11:50:43.909718 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqv8v\" (UniqueName: \"kubernetes.io/projected/25fb55fe-bd61-406c-8bc2-c7963fabd631-kube-api-access-vqv8v\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.909740 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.957017 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.957294 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:43 crc kubenswrapper[4786]: I0313 11:50:43.965498 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.005459 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4tp7g"] Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.011169 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-utilities\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.011207 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zj65\" (UniqueName: \"kubernetes.io/projected/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-kube-api-access-7zj65\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.011250 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-config\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.014400 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-utilities\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.015136 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-client-ca\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.015192 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqv8v\" (UniqueName: \"kubernetes.io/projected/25fb55fe-bd61-406c-8bc2-c7963fabd631-kube-api-access-vqv8v\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.016002 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-proxy-ca-bundles\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.017307 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-catalog-content\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.017340 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb55fe-bd61-406c-8bc2-c7963fabd631-serving-cert\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.017513 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-client-ca\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.017840 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-proxy-ca-bundles\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.017863 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-catalog-content\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.018962 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-config\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.024495 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb55fe-bd61-406c-8bc2-c7963fabd631-serving-cert\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 
11:50:44.031945 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zj65\" (UniqueName: \"kubernetes.io/projected/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-kube-api-access-7zj65\") pod \"community-operators-9bkf6\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.034846 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqv8v\" (UniqueName: \"kubernetes.io/projected/25fb55fe-bd61-406c-8bc2-c7963fabd631-kube-api-access-vqv8v\") pod \"controller-manager-bd9b66b4b-sn528\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.048836 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4rkw\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.055231 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94x7m"] Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.116916 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.158668 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.176185 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:44 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:44 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:44 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.176237 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.219759 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.249559 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4982"] Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.322214 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a94b2e7-89b4-43f9-b7f6-3a433803914e-serving-cert\") pod \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.322265 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-client-ca\") pod \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.322367 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zs2h\" (UniqueName: \"kubernetes.io/projected/6a94b2e7-89b4-43f9-b7f6-3a433803914e-kube-api-access-9zs2h\") pod \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.322406 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-config\") pod \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\" (UID: \"6a94b2e7-89b4-43f9-b7f6-3a433803914e\") " Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.323720 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-client-ca" (OuterVolumeSpecName: "client-ca") pod "6a94b2e7-89b4-43f9-b7f6-3a433803914e" (UID: 
"6a94b2e7-89b4-43f9-b7f6-3a433803914e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.324011 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-config" (OuterVolumeSpecName: "config") pod "6a94b2e7-89b4-43f9-b7f6-3a433803914e" (UID: "6a94b2e7-89b4-43f9-b7f6-3a433803914e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.330494 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a94b2e7-89b4-43f9-b7f6-3a433803914e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6a94b2e7-89b4-43f9-b7f6-3a433803914e" (UID: "6a94b2e7-89b4-43f9-b7f6-3a433803914e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.333334 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a94b2e7-89b4-43f9-b7f6-3a433803914e-kube-api-access-9zs2h" (OuterVolumeSpecName: "kube-api-access-9zs2h") pod "6a94b2e7-89b4-43f9-b7f6-3a433803914e" (UID: "6a94b2e7-89b4-43f9-b7f6-3a433803914e"). InnerVolumeSpecName "kube-api-access-9zs2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.338640 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.393967 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bd9b66b4b-sn528"] Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.419691 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9bkf6"] Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.424108 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.424155 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a94b2e7-89b4-43f9-b7f6-3a433803914e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.424163 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a94b2e7-89b4-43f9-b7f6-3a433803914e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.424172 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zs2h\" (UniqueName: \"kubernetes.io/projected/6a94b2e7-89b4-43f9-b7f6-3a433803914e-kube-api-access-9zs2h\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.713450 4786 generic.go:334] "Generic (PLEG): container finished" podID="939749d8-2927-47a2-8edc-77b4f307e813" containerID="46563af448b1b26370749e0671a79020737c4274364c9f2f18f70e237f268b80" exitCode=0 Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.713708 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tp7g" 
event={"ID":"939749d8-2927-47a2-8edc-77b4f307e813","Type":"ContainerDied","Data":"46563af448b1b26370749e0671a79020737c4274364c9f2f18f70e237f268b80"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.713730 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tp7g" event={"ID":"939749d8-2927-47a2-8edc-77b4f307e813","Type":"ContainerStarted","Data":"a59162bb35190dc1251c09f88ad34faf7fd44d19291aa1b534f6560aea04f927"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.716750 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" event={"ID":"25fb55fe-bd61-406c-8bc2-c7963fabd631","Type":"ContainerStarted","Data":"2803d206aebd07015bbe65c88f09e52f90b20dbd6603bbb10c0e35bfd8197ac7"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.716794 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" event={"ID":"25fb55fe-bd61-406c-8bc2-c7963fabd631","Type":"ContainerStarted","Data":"62f162165299dd04ee5cfb1a7e3c5405f75b0bb9c4cd46fd5d1a3f3d8598e106"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.717015 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.718636 4786 generic.go:334] "Generic (PLEG): container finished" podID="be9e61e1-45b9-42e3-899f-495a710537fc" containerID="4c3959c467d5b50937d21f390d8b593ce05082dc9423208c9f1fffcd068b3b33" exitCode=0 Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.718686 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4982" event={"ID":"be9e61e1-45b9-42e3-899f-495a710537fc","Type":"ContainerDied","Data":"4c3959c467d5b50937d21f390d8b593ce05082dc9423208c9f1fffcd068b3b33"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.718700 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4982" event={"ID":"be9e61e1-45b9-42e3-899f-495a710537fc","Type":"ContainerStarted","Data":"2de0f548e7130f2b0b682fe631a01388002b8ca99858012c803865213bebd447"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.719321 4786 patch_prober.go:28] interesting pod/controller-manager-bd9b66b4b-sn528 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" start-of-body= Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.719364 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" podUID="25fb55fe-bd61-406c-8bc2-c7963fabd631" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.722648 4786 generic.go:334] "Generic (PLEG): container finished" podID="6a94b2e7-89b4-43f9-b7f6-3a433803914e" containerID="99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c" exitCode=0 Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.722685 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.722727 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" event={"ID":"6a94b2e7-89b4-43f9-b7f6-3a433803914e","Type":"ContainerDied","Data":"99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.722770 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26" event={"ID":"6a94b2e7-89b4-43f9-b7f6-3a433803914e","Type":"ContainerDied","Data":"63b1e5b629336b9097e5fda385b3fbd93842b96649027933c65125a66b8dea49"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.722789 4786 scope.go:117] "RemoveContainer" containerID="99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.727591 4786 generic.go:334] "Generic (PLEG): container finished" podID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerID="775192d48c7e0e41b5b9166f5544b91349e6cb1867f234cd246ed08574a9cd24" exitCode=0 Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.727655 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94x7m" event={"ID":"f8353c7b-cabe-46a6-8a98-aea4bad6b499","Type":"ContainerDied","Data":"775192d48c7e0e41b5b9166f5544b91349e6cb1867f234cd246ed08574a9cd24"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.727682 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94x7m" event={"ID":"f8353c7b-cabe-46a6-8a98-aea4bad6b499","Type":"ContainerStarted","Data":"dd283f372163a5d004c4ac6cc08d9aca038e911762e37ff1f5dd3c483ed4d68a"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.732003 4786 generic.go:334] "Generic (PLEG): container 
finished" podID="4675c625-0d2c-4358-9241-627d96dcb2f0" containerID="56c05258f4546d6e676a68dfb301418f6e7f8fcce3cd8dcf8438a2b735e7eb25" exitCode=0 Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.732078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" event={"ID":"4675c625-0d2c-4358-9241-627d96dcb2f0","Type":"ContainerDied","Data":"56c05258f4546d6e676a68dfb301418f6e7f8fcce3cd8dcf8438a2b735e7eb25"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.733889 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerID="a44f34f006e0d256df37e66b8eef7e0be9929451119df2928a03a9bf629c8fb7" exitCode=0 Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.733978 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bkf6" event={"ID":"1ca57952-a8b4-45bc-bf5a-1ddd025835c9","Type":"ContainerDied","Data":"a44f34f006e0d256df37e66b8eef7e0be9929451119df2928a03a9bf629c8fb7"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.734021 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bkf6" event={"ID":"1ca57952-a8b4-45bc-bf5a-1ddd025835c9","Type":"ContainerStarted","Data":"5fd76ee01716ca20fd9dc31bed28c2e80f3cafbb12f968481dc98f7675f73650"} Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.760898 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4rkw"] Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.761664 4786 scope.go:117] "RemoveContainer" containerID="99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c" Mar 13 11:50:44 crc kubenswrapper[4786]: E0313 11:50:44.762205 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca57952_a8b4_45bc_bf5a_1ddd025835c9.slice/crio-conmon-a44f34f006e0d256df37e66b8eef7e0be9929451119df2928a03a9bf629c8fb7.scope\": RecentStats: unable to find data in memory cache]" Mar 13 11:50:44 crc kubenswrapper[4786]: E0313 11:50:44.762629 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c\": container with ID starting with 99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c not found: ID does not exist" containerID="99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.762673 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c"} err="failed to get container status \"99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c\": rpc error: code = NotFound desc = could not find container \"99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c\": container with ID starting with 99c0ed775d702d141048013be0e55335d0d3a75e2528388ae0c763c0b8a12c1c not found: ID does not exist" Mar 13 11:50:44 crc kubenswrapper[4786]: W0313 11:50:44.777689 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod525e850e_04a9_4dc1_91ab_a508136a5e60.slice/crio-ca14c9c79fb4546f60938a5650d957817ad2434123a8236adb1e0ead3ed98a44 WatchSource:0}: Error finding container ca14c9c79fb4546f60938a5650d957817ad2434123a8236adb1e0ead3ed98a44: Status 404 returned error can't find the container with id ca14c9c79fb4546f60938a5650d957817ad2434123a8236adb1e0ead3ed98a44 Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.790103 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" podStartSLOduration=2.790085613 podStartE2EDuration="2.790085613s" podCreationTimestamp="2026-03-13 11:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:44.788409427 +0000 UTC m=+232.068062884" watchObservedRunningTime="2026-03-13 11:50:44.790085613 +0000 UTC m=+232.069739080" Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.822247 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26"] Mar 13 11:50:44 crc kubenswrapper[4786]: I0313 11:50:44.824917 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m9k26"] Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.175537 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:45 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:45 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:45 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.175871 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.239142 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kmpwr"] Mar 13 11:50:45 crc kubenswrapper[4786]: E0313 11:50:45.239537 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a94b2e7-89b4-43f9-b7f6-3a433803914e" containerName="route-controller-manager" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.239606 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a94b2e7-89b4-43f9-b7f6-3a433803914e" containerName="route-controller-manager" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.239813 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a94b2e7-89b4-43f9-b7f6-3a433803914e" containerName="route-controller-manager" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.240637 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.243740 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.253524 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmpwr"] Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.337157 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdgbk\" (UniqueName: \"kubernetes.io/projected/1d680740-f193-4a69-8755-d766703cd61a-kube-api-access-cdgbk\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.337212 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-catalog-content\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.337244 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-utilities\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.438738 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdgbk\" (UniqueName: \"kubernetes.io/projected/1d680740-f193-4a69-8755-d766703cd61a-kube-api-access-cdgbk\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.438800 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-catalog-content\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.438832 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-utilities\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.439355 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-utilities\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.439428 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-catalog-content\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.448194 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a94b2e7-89b4-43f9-b7f6-3a433803914e" path="/var/lib/kubelet/pods/6a94b2e7-89b4-43f9-b7f6-3a433803914e/volumes" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.449001 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.454753 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdgbk\" (UniqueName: \"kubernetes.io/projected/1d680740-f193-4a69-8755-d766703cd61a-kube-api-access-cdgbk\") pod \"redhat-marketplace-kmpwr\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.554718 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.652382 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8bjpz"] Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.665777 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.666543 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bjpz"] Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.750726 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-utilities\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.750962 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn27l\" (UniqueName: \"kubernetes.io/projected/6cf9b878-214e-46cc-b417-49a01c7b5fc9-kube-api-access-hn27l\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.750984 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-catalog-content\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.754917 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" event={"ID":"525e850e-04a9-4dc1-91ab-a508136a5e60","Type":"ContainerStarted","Data":"6da584293915363685a1191078c482657bfd840a53e8c21dab386070ed6a0e5e"} Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.754990 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" event={"ID":"525e850e-04a9-4dc1-91ab-a508136a5e60","Type":"ContainerStarted","Data":"ca14c9c79fb4546f60938a5650d957817ad2434123a8236adb1e0ead3ed98a44"} Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.755196 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.761009 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff"] Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.761575 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.763542 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.763849 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.764137 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.765979 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.766163 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.766286 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:50:45 crc 
kubenswrapper[4786]: I0313 11:50:45.768935 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.774242 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" podStartSLOduration=172.7742194 podStartE2EDuration="2m52.7742194s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.770115268 +0000 UTC m=+233.049768735" watchObservedRunningTime="2026-03-13 11:50:45.7742194 +0000 UTC m=+233.053872847" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.775519 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff"] Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.853509 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn27l\" (UniqueName: \"kubernetes.io/projected/6cf9b878-214e-46cc-b417-49a01c7b5fc9-kube-api-access-hn27l\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.853567 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-utilities\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.853588 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-catalog-content\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.853609 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca20ab6-bb5c-45e0-b6d3-37725a376013-serving-cert\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.853649 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndkn\" (UniqueName: \"kubernetes.io/projected/3ca20ab6-bb5c-45e0-b6d3-37725a376013-kube-api-access-tndkn\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.853699 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-config\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.853759 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-client-ca\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " 
pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.855522 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-utilities\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.855739 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-catalog-content\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.875209 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn27l\" (UniqueName: \"kubernetes.io/projected/6cf9b878-214e-46cc-b417-49a01c7b5fc9-kube-api-access-hn27l\") pod \"redhat-marketplace-8bjpz\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.956038 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca20ab6-bb5c-45e0-b6d3-37725a376013-serving-cert\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.956086 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tndkn\" (UniqueName: \"kubernetes.io/projected/3ca20ab6-bb5c-45e0-b6d3-37725a376013-kube-api-access-tndkn\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: 
\"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.956123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-config\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.956164 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-client-ca\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.957270 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-client-ca\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.961021 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-config\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.964377 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ca20ab6-bb5c-45e0-b6d3-37725a376013-serving-cert\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.986736 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:50:45 crc kubenswrapper[4786]: I0313 11:50:45.995861 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndkn\" (UniqueName: \"kubernetes.io/projected/3ca20ab6-bb5c-45e0-b6d3-37725a376013-kube-api-access-tndkn\") pod \"route-controller-manager-698ff8b74-ksfff\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.007914 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmpwr"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.086376 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.136737 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47410: no serving certificate available for the kubelet" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.171025 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.176420 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:46 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:46 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:46 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.176464 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.259769 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k7js\" (UniqueName: \"kubernetes.io/projected/4675c625-0d2c-4358-9241-627d96dcb2f0-kube-api-access-4k7js\") pod \"4675c625-0d2c-4358-9241-627d96dcb2f0\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.259915 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4675c625-0d2c-4358-9241-627d96dcb2f0-config-volume\") pod \"4675c625-0d2c-4358-9241-627d96dcb2f0\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.259936 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4675c625-0d2c-4358-9241-627d96dcb2f0-secret-volume\") pod \"4675c625-0d2c-4358-9241-627d96dcb2f0\" (UID: \"4675c625-0d2c-4358-9241-627d96dcb2f0\") " 
Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.264607 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4675c625-0d2c-4358-9241-627d96dcb2f0-config-volume" (OuterVolumeSpecName: "config-volume") pod "4675c625-0d2c-4358-9241-627d96dcb2f0" (UID: "4675c625-0d2c-4358-9241-627d96dcb2f0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.272459 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4675c625-0d2c-4358-9241-627d96dcb2f0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4675c625-0d2c-4358-9241-627d96dcb2f0" (UID: "4675c625-0d2c-4358-9241-627d96dcb2f0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.273737 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4675c625-0d2c-4358-9241-627d96dcb2f0-kube-api-access-4k7js" (OuterVolumeSpecName: "kube-api-access-4k7js") pod "4675c625-0d2c-4358-9241-627d96dcb2f0" (UID: "4675c625-0d2c-4358-9241-627d96dcb2f0"). InnerVolumeSpecName "kube-api-access-4k7js". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.361809 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4675c625-0d2c-4358-9241-627d96dcb2f0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.361839 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4675c625-0d2c-4358-9241-627d96dcb2f0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.361848 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k7js\" (UniqueName: \"kubernetes.io/projected/4675c625-0d2c-4358-9241-627d96dcb2f0-kube-api-access-4k7js\") on node \"crc\" DevicePath \"\"" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.436950 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4gfdq"] Mar 13 11:50:46 crc kubenswrapper[4786]: E0313 11:50:46.437158 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4675c625-0d2c-4358-9241-627d96dcb2f0" containerName="collect-profiles" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.437169 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4675c625-0d2c-4358-9241-627d96dcb2f0" containerName="collect-profiles" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.437275 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4675c625-0d2c-4358-9241-627d96dcb2f0" containerName="collect-profiles" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.438489 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.441202 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.444623 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gfdq"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.463253 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.466165 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.467530 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.468265 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.468524 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.527268 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bjpz"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.553066 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.564829 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180ce447-e137-492a-bccd-f40344492a31-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"180ce447-e137-492a-bccd-f40344492a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.564899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zh9k\" (UniqueName: \"kubernetes.io/projected/17bbca1c-a838-4407-834c-45b6129b32b8-kube-api-access-4zh9k\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.564955 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180ce447-e137-492a-bccd-f40344492a31-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"180ce447-e137-492a-bccd-f40344492a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.564977 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-catalog-content\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.565008 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-utilities\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.572136 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 
11:50:46.572860 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.581599 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.581805 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.586470 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.666027 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180ce447-e137-492a-bccd-f40344492a31-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"180ce447-e137-492a-bccd-f40344492a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.666088 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zh9k\" (UniqueName: \"kubernetes.io/projected/17bbca1c-a838-4407-834c-45b6129b32b8-kube-api-access-4zh9k\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.666129 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad244ddf-d679-419f-9308-5584ea9b3e04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ad244ddf-d679-419f-9308-5584ea9b3e04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.666171 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180ce447-e137-492a-bccd-f40344492a31-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"180ce447-e137-492a-bccd-f40344492a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.666172 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180ce447-e137-492a-bccd-f40344492a31-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"180ce447-e137-492a-bccd-f40344492a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.666195 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-catalog-content\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.666429 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-utilities\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.666474 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad244ddf-d679-419f-9308-5584ea9b3e04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ad244ddf-d679-419f-9308-5584ea9b3e04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.667018 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-catalog-content\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.667072 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-utilities\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.684263 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180ce447-e137-492a-bccd-f40344492a31-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"180ce447-e137-492a-bccd-f40344492a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.685126 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zh9k\" (UniqueName: \"kubernetes.io/projected/17bbca1c-a838-4407-834c-45b6129b32b8-kube-api-access-4zh9k\") pod \"redhat-operators-4gfdq\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.756471 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-6rd6k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.756526 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6rd6k" podUID="b7a97bba-7000-4634-9cfe-efcc38685708" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.756654 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-6rd6k container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.756702 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6rd6k" podUID="b7a97bba-7000-4634-9cfe-efcc38685708" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.767668 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad244ddf-d679-419f-9308-5584ea9b3e04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ad244ddf-d679-419f-9308-5584ea9b3e04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.767934 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad244ddf-d679-419f-9308-5584ea9b3e04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ad244ddf-d679-419f-9308-5584ea9b3e04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.769065 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad244ddf-d679-419f-9308-5584ea9b3e04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ad244ddf-d679-419f-9308-5584ea9b3e04\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.788606 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad244ddf-d679-419f-9308-5584ea9b3e04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ad244ddf-d679-419f-9308-5584ea9b3e04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.790232 4786 generic.go:334] "Generic (PLEG): container finished" podID="1d680740-f193-4a69-8755-d766703cd61a" containerID="4d2635fac95e0680bdf9c0ad1ee7a1a8cf22957d4d1862e0b43dfc3ff765a6c0" exitCode=0 Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.790321 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmpwr" event={"ID":"1d680740-f193-4a69-8755-d766703cd61a","Type":"ContainerDied","Data":"4d2635fac95e0680bdf9c0ad1ee7a1a8cf22957d4d1862e0b43dfc3ff765a6c0"} Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.790358 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmpwr" event={"ID":"1d680740-f193-4a69-8755-d766703cd61a","Type":"ContainerStarted","Data":"d86d7b7d29aa0acae548f9e0163b3ce579d57e2eefb905b6b761f88af6bba6c4"} Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.792678 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.800424 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf" event={"ID":"4675c625-0d2c-4358-9241-627d96dcb2f0","Type":"ContainerDied","Data":"f8e659878e65e3880486e67991658b742f27c88616b241ce98a0fd07812881db"} Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.800440 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8e659878e65e3880486e67991658b742f27c88616b241ce98a0fd07812881db" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.801729 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.808149 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.815588 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.820764 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pckgr" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.844464 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xftvt"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.845450 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.856343 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.857421 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.858228 4786 patch_prober.go:28] interesting pod/console-f9d7485db-nzss5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.858267 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nzss5" podUID="ed0ec184-b55e-474a-9e11-72957a85689d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.862205 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xftvt"] Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.894505 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.977653 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-catalog-content\") pod \"redhat-operators-xftvt\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.977775 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-utilities\") pod \"redhat-operators-xftvt\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:46 crc kubenswrapper[4786]: I0313 11:50:46.977795 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gx7v\" (UniqueName: \"kubernetes.io/projected/a6b7548d-0202-4690-b267-90076b5e4687-kube-api-access-8gx7v\") pod \"redhat-operators-xftvt\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.079274 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-catalog-content\") pod \"redhat-operators-xftvt\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.079368 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-utilities\") pod \"redhat-operators-xftvt\" (UID: 
\"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.079400 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gx7v\" (UniqueName: \"kubernetes.io/projected/a6b7548d-0202-4690-b267-90076b5e4687-kube-api-access-8gx7v\") pod \"redhat-operators-xftvt\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.079810 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-catalog-content\") pod \"redhat-operators-xftvt\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.079954 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-utilities\") pod \"redhat-operators-xftvt\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.095016 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gx7v\" (UniqueName: \"kubernetes.io/projected/a6b7548d-0202-4690-b267-90076b5e4687-kube-api-access-8gx7v\") pod \"redhat-operators-xftvt\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.169797 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.173728 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:47 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:47 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:47 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.173787 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.176189 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.476092 4786 ???:1] "http: TLS handshake error from 192.168.126.11:47418: no serving certificate available for the kubelet" Mar 13 11:50:47 crc kubenswrapper[4786]: I0313 11:50:47.581159 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:50:48 crc kubenswrapper[4786]: I0313 11:50:48.172550 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:48 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:48 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:48 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:48 crc kubenswrapper[4786]: I0313 11:50:48.172835 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:49 crc kubenswrapper[4786]: I0313 11:50:49.172002 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:49 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:49 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:49 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:49 crc kubenswrapper[4786]: I0313 11:50:49.172103 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:50 crc kubenswrapper[4786]: I0313 11:50:50.176622 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:50 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:50 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:50 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:50 crc kubenswrapper[4786]: I0313 11:50:50.176687 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:51 crc kubenswrapper[4786]: I0313 11:50:51.171574 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:51 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:51 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:51 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:51 crc kubenswrapper[4786]: I0313 11:50:51.171629 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:51 crc kubenswrapper[4786]: I0313 11:50:51.287704 4786 ???:1] "http: TLS handshake error from 192.168.126.11:59334: no serving certificate available for the kubelet" Mar 13 11:50:52 crc kubenswrapper[4786]: I0313 11:50:52.171923 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:52 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:52 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:52 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:52 crc kubenswrapper[4786]: I0313 11:50:52.172337 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:52 crc kubenswrapper[4786]: I0313 11:50:52.649871 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qbr8s" Mar 13 11:50:52 crc kubenswrapper[4786]: I0313 11:50:52.831666 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bjpz" 
event={"ID":"6cf9b878-214e-46cc-b417-49a01c7b5fc9","Type":"ContainerStarted","Data":"e75ecc1523e9765f37010262ad1fc8c9f6b9708b298b3b0fc7169888779985c8"} Mar 13 11:50:52 crc kubenswrapper[4786]: I0313 11:50:52.832694 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" event={"ID":"3ca20ab6-bb5c-45e0-b6d3-37725a376013","Type":"ContainerStarted","Data":"a68bb67e003b3589879589a48cda527ad129980481ff511dd8a8baef160295fb"} Mar 13 11:50:53 crc kubenswrapper[4786]: I0313 11:50:53.172202 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:53 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:53 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:53 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:53 crc kubenswrapper[4786]: I0313 11:50:53.172262 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:54 crc kubenswrapper[4786]: I0313 11:50:54.172551 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:54 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:54 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:54 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:54 crc kubenswrapper[4786]: I0313 11:50:54.172911 4786 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:55 crc kubenswrapper[4786]: I0313 11:50:55.172844 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:55 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:55 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:55 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:55 crc kubenswrapper[4786]: I0313 11:50:55.173269 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:56 crc kubenswrapper[4786]: I0313 11:50:56.172714 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:56 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:56 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:56 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:56 crc kubenswrapper[4786]: I0313 11:50:56.172784 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:56 crc kubenswrapper[4786]: I0313 11:50:56.761924 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-7954f5f757-6rd6k" Mar 13 11:50:56 crc kubenswrapper[4786]: I0313 11:50:56.856794 4786 patch_prober.go:28] interesting pod/console-f9d7485db-nzss5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 13 11:50:56 crc kubenswrapper[4786]: I0313 11:50:56.857158 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nzss5" podUID="ed0ec184-b55e-474a-9e11-72957a85689d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 13 11:50:57 crc kubenswrapper[4786]: I0313 11:50:57.172812 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:57 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:57 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:57 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:57 crc kubenswrapper[4786]: I0313 11:50:57.172864 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:58 crc kubenswrapper[4786]: I0313 11:50:58.172421 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:58 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:58 crc 
kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:58 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:58 crc kubenswrapper[4786]: I0313 11:50:58.172536 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:50:58 crc kubenswrapper[4786]: E0313 11:50:58.189448 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 13 11:50:58 crc kubenswrapper[4786]: E0313 11:50:58.189608 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 11:50:58 crc kubenswrapper[4786]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 13 11:50:58 crc kubenswrapper[4786]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q5qp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29556710-bmgbc_openshift-infra(01e610d5-a3b8-4fc8-a472-01ab5bb625d5): ErrImagePull: rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 13 11:50:58 crc kubenswrapper[4786]: > logger="UnhandledError" Mar 13 11:50:58 crc kubenswrapper[4786]: E0313 11:50:58.190851 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" podUID="01e610d5-a3b8-4fc8-a472-01ab5bb625d5" Mar 13 11:50:58 crc kubenswrapper[4786]: E0313 11:50:58.884961 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" podUID="01e610d5-a3b8-4fc8-a472-01ab5bb625d5" Mar 13 11:50:58 crc kubenswrapper[4786]: I0313 11:50:58.961837 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:58 crc kubenswrapper[4786]: I0313 11:50:58.966674 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 11:50:58 crc kubenswrapper[4786]: I0313 11:50:58.982708 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19009bf-0d5a-458f-8c3e-97bc203741b1-metrics-certs\") pod \"network-metrics-daemon-g4pzt\" (UID: \"c19009bf-0d5a-458f-8c3e-97bc203741b1\") " pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:59 crc kubenswrapper[4786]: I0313 11:50:59.159276 4786 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 11:50:59 crc kubenswrapper[4786]: I0313 11:50:59.167749 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g4pzt" Mar 13 11:50:59 crc kubenswrapper[4786]: I0313 11:50:59.172750 4786 patch_prober.go:28] interesting pod/router-default-5444994796-hk9g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:50:59 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 11:50:59 crc kubenswrapper[4786]: [+]process-running ok Mar 13 11:50:59 crc kubenswrapper[4786]: healthz check failed Mar 13 11:50:59 crc kubenswrapper[4786]: I0313 11:50:59.172811 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hk9g4" podUID="f4eaf640-4e83-4528-ac9b-52a663fd5f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:00 crc kubenswrapper[4786]: I0313 11:51:00.172455 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:51:00 crc kubenswrapper[4786]: I0313 11:51:00.176447 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hk9g4" Mar 13 11:51:01 crc kubenswrapper[4786]: I0313 11:51:01.719096 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bd9b66b4b-sn528"] Mar 13 11:51:01 crc kubenswrapper[4786]: I0313 11:51:01.719517 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" podUID="25fb55fe-bd61-406c-8bc2-c7963fabd631" containerName="controller-manager" 
containerID="cri-o://2803d206aebd07015bbe65c88f09e52f90b20dbd6603bbb10c0e35bfd8197ac7" gracePeriod=30 Mar 13 11:51:01 crc kubenswrapper[4786]: I0313 11:51:01.754340 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff"] Mar 13 11:51:04 crc kubenswrapper[4786]: I0313 11:51:04.118035 4786 patch_prober.go:28] interesting pod/controller-manager-bd9b66b4b-sn528 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" start-of-body= Mar 13 11:51:04 crc kubenswrapper[4786]: I0313 11:51:04.118368 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" podUID="25fb55fe-bd61-406c-8bc2-c7963fabd631" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" Mar 13 11:51:04 crc kubenswrapper[4786]: I0313 11:51:04.347346 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:51:04 crc kubenswrapper[4786]: I0313 11:51:04.912988 4786 generic.go:334] "Generic (PLEG): container finished" podID="25fb55fe-bd61-406c-8bc2-c7963fabd631" containerID="2803d206aebd07015bbe65c88f09e52f90b20dbd6603bbb10c0e35bfd8197ac7" exitCode=0 Mar 13 11:51:04 crc kubenswrapper[4786]: I0313 11:51:04.913044 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" event={"ID":"25fb55fe-bd61-406c-8bc2-c7963fabd631","Type":"ContainerDied","Data":"2803d206aebd07015bbe65c88f09e52f90b20dbd6603bbb10c0e35bfd8197ac7"} Mar 13 11:51:05 crc kubenswrapper[4786]: E0313 11:51:05.986479 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 11:51:05 crc kubenswrapper[4786]: E0313 11:51:05.986965 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m9w9g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4tp7g_openshift-marketplace(939749d8-2927-47a2-8edc-77b4f307e813): ErrImagePull: rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:51:05 crc kubenswrapper[4786]: E0313 11:51:05.988818 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4tp7g" podUID="939749d8-2927-47a2-8edc-77b4f307e813" Mar 13 11:51:06 crc kubenswrapper[4786]: I0313 11:51:06.368431 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:51:06 crc kubenswrapper[4786]: I0313 11:51:06.860558 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:51:06 crc kubenswrapper[4786]: I0313 11:51:06.863744 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nzss5" Mar 13 11:51:07 crc kubenswrapper[4786]: E0313 11:51:07.567418 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4tp7g" podUID="939749d8-2927-47a2-8edc-77b4f307e813" Mar 13 11:51:08 crc kubenswrapper[4786]: I0313 11:51:08.169052 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:51:08 crc kubenswrapper[4786]: I0313 11:51:08.169140 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" 
podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:51:09 crc kubenswrapper[4786]: W0313 11:51:09.327594 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod180ce447_e137_492a_bccd_f40344492a31.slice/crio-0af8c3ea1c61c50b6b9ee482c65a75760e9ea7903b6f4a1b072a3caf0cc3bb53 WatchSource:0}: Error finding container 0af8c3ea1c61c50b6b9ee482c65a75760e9ea7903b6f4a1b072a3caf0cc3bb53: Status 404 returned error can't find the container with id 0af8c3ea1c61c50b6b9ee482c65a75760e9ea7903b6f4a1b072a3caf0cc3bb53 Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.370255 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.411928 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.412297 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dzfjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-94x7m_openshift-marketplace(f8353c7b-cabe-46a6-8a98-aea4bad6b499): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.412647 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-647f6f76fb-cd8wq"] Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.412864 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fb55fe-bd61-406c-8bc2-c7963fabd631" containerName="controller-manager" Mar 13 11:51:09 crc kubenswrapper[4786]: 
I0313 11:51:09.412889 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fb55fe-bd61-406c-8bc2-c7963fabd631" containerName="controller-manager" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.412994 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fb55fe-bd61-406c-8bc2-c7963fabd631" containerName="controller-manager" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.413343 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.413866 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-94x7m" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.421008 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-647f6f76fb-cd8wq"] Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.437723 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb55fe-bd61-406c-8bc2-c7963fabd631-serving-cert\") pod \"25fb55fe-bd61-406c-8bc2-c7963fabd631\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.437835 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-client-ca\") pod \"25fb55fe-bd61-406c-8bc2-c7963fabd631\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.437900 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-proxy-ca-bundles\") pod \"25fb55fe-bd61-406c-8bc2-c7963fabd631\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.437923 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqv8v\" (UniqueName: \"kubernetes.io/projected/25fb55fe-bd61-406c-8bc2-c7963fabd631-kube-api-access-vqv8v\") pod \"25fb55fe-bd61-406c-8bc2-c7963fabd631\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.437953 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-config\") pod \"25fb55fe-bd61-406c-8bc2-c7963fabd631\" (UID: \"25fb55fe-bd61-406c-8bc2-c7963fabd631\") " Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.439357 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-client-ca" (OuterVolumeSpecName: "client-ca") pod "25fb55fe-bd61-406c-8bc2-c7963fabd631" (UID: "25fb55fe-bd61-406c-8bc2-c7963fabd631"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.439433 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-config" (OuterVolumeSpecName: "config") pod "25fb55fe-bd61-406c-8bc2-c7963fabd631" (UID: "25fb55fe-bd61-406c-8bc2-c7963fabd631"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.439832 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "25fb55fe-bd61-406c-8bc2-c7963fabd631" (UID: "25fb55fe-bd61-406c-8bc2-c7963fabd631"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.444446 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25fb55fe-bd61-406c-8bc2-c7963fabd631-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25fb55fe-bd61-406c-8bc2-c7963fabd631" (UID: "25fb55fe-bd61-406c-8bc2-c7963fabd631"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.444906 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25fb55fe-bd61-406c-8bc2-c7963fabd631-kube-api-access-vqv8v" (OuterVolumeSpecName: "kube-api-access-vqv8v") pod "25fb55fe-bd61-406c-8bc2-c7963fabd631" (UID: "25fb55fe-bd61-406c-8bc2-c7963fabd631"). InnerVolumeSpecName "kube-api-access-vqv8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.496849 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.497019 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7zj65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod community-operators-9bkf6_openshift-marketplace(1ca57952-a8b4-45bc-bf5a-1ddd025835c9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.498181 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9bkf6" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.523597 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.523936 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5c2rc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-f4982_openshift-marketplace(be9e61e1-45b9-42e3-899f-495a710537fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:51:09 crc kubenswrapper[4786]: E0313 11:51:09.526253 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-f4982" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" Mar 13 11:51:09 crc 
kubenswrapper[4786]: I0313 11:51:09.538778 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9deba0-f433-4584-bb71-431d7d4648d2-serving-cert\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.538834 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-client-ca\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.538874 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x72zh\" (UniqueName: \"kubernetes.io/projected/4f9deba0-f433-4584-bb71-431d7d4648d2-kube-api-access-x72zh\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.538909 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-proxy-ca-bundles\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.538927 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-config\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.538973 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.539027 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.539086 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqv8v\" (UniqueName: \"kubernetes.io/projected/25fb55fe-bd61-406c-8bc2-c7963fabd631-kube-api-access-vqv8v\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.539099 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb55fe-bd61-406c-8bc2-c7963fabd631-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.539109 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb55fe-bd61-406c-8bc2-c7963fabd631-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.641146 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-config\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc 
kubenswrapper[4786]: I0313 11:51:09.641262 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9deba0-f433-4584-bb71-431d7d4648d2-serving-cert\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.641316 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-client-ca\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.641374 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x72zh\" (UniqueName: \"kubernetes.io/projected/4f9deba0-f433-4584-bb71-431d7d4648d2-kube-api-access-x72zh\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.641406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-proxy-ca-bundles\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.643509 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-client-ca\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") 
" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.644386 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-proxy-ca-bundles\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.646025 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9deba0-f433-4584-bb71-431d7d4648d2-serving-cert\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.646989 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xftvt"] Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.647851 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-config\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.665355 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x72zh\" (UniqueName: \"kubernetes.io/projected/4f9deba0-f433-4584-bb71-431d7d4648d2-kube-api-access-x72zh\") pod \"controller-manager-647f6f76fb-cd8wq\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: W0313 11:51:09.666454 4786 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6b7548d_0202_4690_b267_90076b5e4687.slice/crio-276c25a245a1f429b1f571ad21eb54e2b9bae336e06c670ded731d3a8286378a WatchSource:0}: Error finding container 276c25a245a1f429b1f571ad21eb54e2b9bae336e06c670ded731d3a8286378a: Status 404 returned error can't find the container with id 276c25a245a1f429b1f571ad21eb54e2b9bae336e06c670ded731d3a8286378a Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.716214 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g4pzt"] Mar 13 11:51:09 crc kubenswrapper[4786]: W0313 11:51:09.724014 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19009bf_0d5a_458f_8c3e_97bc203741b1.slice/crio-134fdf9b9c6b05d314f40e60a13927dae669c7905774db2ec8cbdd0c1e2840b0 WatchSource:0}: Error finding container 134fdf9b9c6b05d314f40e60a13927dae669c7905774db2ec8cbdd0c1e2840b0: Status 404 returned error can't find the container with id 134fdf9b9c6b05d314f40e60a13927dae669c7905774db2ec8cbdd0c1e2840b0 Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.750180 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.772384 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gfdq"] Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.780075 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:51:09 crc kubenswrapper[4786]: W0313 11:51:09.800970 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17bbca1c_a838_4407_834c_45b6129b32b8.slice/crio-752fb6ed5fa450574193c3c8f1c4eb7256533cb3a6159d330dea19fb4c08255c WatchSource:0}: Error finding container 752fb6ed5fa450574193c3c8f1c4eb7256533cb3a6159d330dea19fb4c08255c: Status 404 returned error can't find the container with id 752fb6ed5fa450574193c3c8f1c4eb7256533cb3a6159d330dea19fb4c08255c Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.960443 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gfdq" event={"ID":"17bbca1c-a838-4407-834c-45b6129b32b8","Type":"ContainerStarted","Data":"752fb6ed5fa450574193c3c8f1c4eb7256533cb3a6159d330dea19fb4c08255c"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.960819 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-647f6f76fb-cd8wq"] Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.962642 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" event={"ID":"c19009bf-0d5a-458f-8c3e-97bc203741b1","Type":"ContainerStarted","Data":"134fdf9b9c6b05d314f40e60a13927dae669c7905774db2ec8cbdd0c1e2840b0"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.964839 4786 generic.go:334] "Generic (PLEG): container finished" podID="1d680740-f193-4a69-8755-d766703cd61a" 
containerID="cf81364e364eb6f6cfcd50e6e48583773099a95819226540d6841a177e2131a9" exitCode=0 Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.964874 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmpwr" event={"ID":"1d680740-f193-4a69-8755-d766703cd61a","Type":"ContainerDied","Data":"cf81364e364eb6f6cfcd50e6e48583773099a95819226540d6841a177e2131a9"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.968595 4786 generic.go:334] "Generic (PLEG): container finished" podID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerID="5836a261ed75f6ab224fa1bc890f0a381017b9a010ca7e2fc7f5a63c99f2cb81" exitCode=0 Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.968633 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bjpz" event={"ID":"6cf9b878-214e-46cc-b417-49a01c7b5fc9","Type":"ContainerDied","Data":"5836a261ed75f6ab224fa1bc890f0a381017b9a010ca7e2fc7f5a63c99f2cb81"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.972844 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ad244ddf-d679-419f-9308-5584ea9b3e04","Type":"ContainerStarted","Data":"a0943dbbef48d291174d0f304dcc1e96bbda9972143008b9db5e8c29a17c98f2"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.975279 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" podUID="3ca20ab6-bb5c-45e0-b6d3-37725a376013" containerName="route-controller-manager" containerID="cri-o://0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142" gracePeriod=30 Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.975356 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" 
event={"ID":"3ca20ab6-bb5c-45e0-b6d3-37725a376013","Type":"ContainerStarted","Data":"0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.975650 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.983415 4786 generic.go:334] "Generic (PLEG): container finished" podID="a6b7548d-0202-4690-b267-90076b5e4687" containerID="566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde" exitCode=0 Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.983489 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xftvt" event={"ID":"a6b7548d-0202-4690-b267-90076b5e4687","Type":"ContainerDied","Data":"566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.983518 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xftvt" event={"ID":"a6b7548d-0202-4690-b267-90076b5e4687","Type":"ContainerStarted","Data":"276c25a245a1f429b1f571ad21eb54e2b9bae336e06c670ded731d3a8286378a"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.986318 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"180ce447-e137-492a-bccd-f40344492a31","Type":"ContainerStarted","Data":"dbcf52d8598211c3004a90c5a7b4432d6ff39d1e2071aa6f37983c38d761d373"} Mar 13 11:51:09 crc kubenswrapper[4786]: I0313 11:51:09.986349 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"180ce447-e137-492a-bccd-f40344492a31","Type":"ContainerStarted","Data":"0af8c3ea1c61c50b6b9ee482c65a75760e9ea7903b6f4a1b072a3caf0cc3bb53"} Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.007911 4786 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.007967 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd9b66b4b-sn528" event={"ID":"25fb55fe-bd61-406c-8bc2-c7963fabd631","Type":"ContainerDied","Data":"62f162165299dd04ee5cfb1a7e3c5405f75b0bb9c4cd46fd5d1a3f3d8598e106"} Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.008063 4786 scope.go:117] "RemoveContainer" containerID="2803d206aebd07015bbe65c88f09e52f90b20dbd6603bbb10c0e35bfd8197ac7" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.011575 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" podStartSLOduration=28.011553488 podStartE2EDuration="28.011553488s" podCreationTimestamp="2026-03-13 11:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:10.000471353 +0000 UTC m=+257.280124810" watchObservedRunningTime="2026-03-13 11:51:10.011553488 +0000 UTC m=+257.291206935" Mar 13 11:51:10 crc kubenswrapper[4786]: E0313 11:51:10.017464 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-94x7m" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" Mar 13 11:51:10 crc kubenswrapper[4786]: E0313 11:51:10.020821 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9bkf6" 
podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" Mar 13 11:51:10 crc kubenswrapper[4786]: E0313 11:51:10.020951 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-f4982" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.085325 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=24.085311344 podStartE2EDuration="24.085311344s" podCreationTimestamp="2026-03-13 11:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:10.084633996 +0000 UTC m=+257.364287433" watchObservedRunningTime="2026-03-13 11:51:10.085311344 +0000 UTC m=+257.364964791" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.107932 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bd9b66b4b-sn528"] Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.111231 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bd9b66b4b-sn528"] Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.128736 4786 patch_prober.go:28] interesting pod/route-controller-manager-698ff8b74-ksfff container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": read tcp 10.217.0.2:56500->10.217.0.52:8443: read: connection reset by peer" start-of-body= Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.129032 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" 
podUID="3ca20ab6-bb5c-45e0-b6d3-37725a376013" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": read tcp 10.217.0.2:56500->10.217.0.52:8443: read: connection reset by peer" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.432304 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-698ff8b74-ksfff_3ca20ab6-bb5c-45e0-b6d3-37725a376013/route-controller-manager/0.log" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.432837 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.554070 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-config\") pod \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.554137 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca20ab6-bb5c-45e0-b6d3-37725a376013-serving-cert\") pod \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.554165 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-client-ca\") pod \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.554228 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tndkn\" (UniqueName: 
\"kubernetes.io/projected/3ca20ab6-bb5c-45e0-b6d3-37725a376013-kube-api-access-tndkn\") pod \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\" (UID: \"3ca20ab6-bb5c-45e0-b6d3-37725a376013\") " Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.554848 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-client-ca" (OuterVolumeSpecName: "client-ca") pod "3ca20ab6-bb5c-45e0-b6d3-37725a376013" (UID: "3ca20ab6-bb5c-45e0-b6d3-37725a376013"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.554869 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-config" (OuterVolumeSpecName: "config") pod "3ca20ab6-bb5c-45e0-b6d3-37725a376013" (UID: "3ca20ab6-bb5c-45e0-b6d3-37725a376013"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.558779 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca20ab6-bb5c-45e0-b6d3-37725a376013-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3ca20ab6-bb5c-45e0-b6d3-37725a376013" (UID: "3ca20ab6-bb5c-45e0-b6d3-37725a376013"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.563020 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca20ab6-bb5c-45e0-b6d3-37725a376013-kube-api-access-tndkn" (OuterVolumeSpecName: "kube-api-access-tndkn") pod "3ca20ab6-bb5c-45e0-b6d3-37725a376013" (UID: "3ca20ab6-bb5c-45e0-b6d3-37725a376013"). InnerVolumeSpecName "kube-api-access-tndkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.655259 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.655531 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca20ab6-bb5c-45e0-b6d3-37725a376013-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.655540 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca20ab6-bb5c-45e0-b6d3-37725a376013-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:10 crc kubenswrapper[4786]: I0313 11:51:10.655549 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tndkn\" (UniqueName: \"kubernetes.io/projected/3ca20ab6-bb5c-45e0-b6d3-37725a376013-kube-api-access-tndkn\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.015628 4786 generic.go:334] "Generic (PLEG): container finished" podID="17bbca1c-a838-4407-834c-45b6129b32b8" containerID="f3011311a409ea03706c0bafec45ae414f9caa64b5fd14e8b44f9f67923514bd" exitCode=0 Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.015690 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gfdq" event={"ID":"17bbca1c-a838-4407-834c-45b6129b32b8","Type":"ContainerDied","Data":"f3011311a409ea03706c0bafec45ae414f9caa64b5fd14e8b44f9f67923514bd"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.022006 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmpwr" 
event={"ID":"1d680740-f193-4a69-8755-d766703cd61a","Type":"ContainerStarted","Data":"acd494a68da9d6b486c6956128882e3b9fab934308e2a3e919c981d8f69b5245"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.023460 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" event={"ID":"4f9deba0-f433-4584-bb71-431d7d4648d2","Type":"ContainerStarted","Data":"afa16fe4adf95ccac10b15882726c23afd32780d2ab7744d0e9bf2cf41727e76"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.023502 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" event={"ID":"4f9deba0-f433-4584-bb71-431d7d4648d2","Type":"ContainerStarted","Data":"73f109753e6644546190f5091b039d11c8af4697bae2ed2426e44301097159ae"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.023684 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.025062 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" event={"ID":"c19009bf-0d5a-458f-8c3e-97bc203741b1","Type":"ContainerStarted","Data":"5239def8689584e90408c971f235571eeab122aee0e0532007c1d6f0c36966a4"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.025093 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g4pzt" event={"ID":"c19009bf-0d5a-458f-8c3e-97bc203741b1","Type":"ContainerStarted","Data":"e6b4e4688b8a0f9894cc80c05b5b22dd262a41262e24f5f08252b86d419312f3"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.026605 4786 generic.go:334] "Generic (PLEG): container finished" podID="ad244ddf-d679-419f-9308-5584ea9b3e04" containerID="39b8e44c227f85687cea9482b8bf06ffc8b6c0352c2529142108cfc80c4c4bd1" exitCode=0 Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.026669 
4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ad244ddf-d679-419f-9308-5584ea9b3e04","Type":"ContainerDied","Data":"39b8e44c227f85687cea9482b8bf06ffc8b6c0352c2529142108cfc80c4c4bd1"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.028139 4786 generic.go:334] "Generic (PLEG): container finished" podID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerID="0dd40450eb04c83389c750fd5dae98e3c7ee1503a5c4b1bc4c3b5668c35505da" exitCode=0 Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.028184 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bjpz" event={"ID":"6cf9b878-214e-46cc-b417-49a01c7b5fc9","Type":"ContainerDied","Data":"0dd40450eb04c83389c750fd5dae98e3c7ee1503a5c4b1bc4c3b5668c35505da"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.035592 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.038825 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-698ff8b74-ksfff_3ca20ab6-bb5c-45e0-b6d3-37725a376013/route-controller-manager/0.log" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.038864 4786 generic.go:334] "Generic (PLEG): container finished" podID="3ca20ab6-bb5c-45e0-b6d3-37725a376013" containerID="0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142" exitCode=255 Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.038994 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.039369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" event={"ID":"3ca20ab6-bb5c-45e0-b6d3-37725a376013","Type":"ContainerDied","Data":"0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.039402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff" event={"ID":"3ca20ab6-bb5c-45e0-b6d3-37725a376013","Type":"ContainerDied","Data":"a68bb67e003b3589879589a48cda527ad129980481ff511dd8a8baef160295fb"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.039455 4786 scope.go:117] "RemoveContainer" containerID="0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.044010 4786 generic.go:334] "Generic (PLEG): container finished" podID="180ce447-e137-492a-bccd-f40344492a31" containerID="dbcf52d8598211c3004a90c5a7b4432d6ff39d1e2071aa6f37983c38d761d373" exitCode=0 Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.044048 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"180ce447-e137-492a-bccd-f40344492a31","Type":"ContainerDied","Data":"dbcf52d8598211c3004a90c5a7b4432d6ff39d1e2071aa6f37983c38d761d373"} Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.057425 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g4pzt" podStartSLOduration=198.057403331 podStartE2EDuration="3m18.057403331s" podCreationTimestamp="2026-03-13 11:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 11:51:11.05082322 +0000 UTC m=+258.330476677" watchObservedRunningTime="2026-03-13 11:51:11.057403331 +0000 UTC m=+258.337056818" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.093606 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" podStartSLOduration=10.093591345 podStartE2EDuration="10.093591345s" podCreationTimestamp="2026-03-13 11:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:11.091346024 +0000 UTC m=+258.370999481" watchObservedRunningTime="2026-03-13 11:51:11.093591345 +0000 UTC m=+258.373244792" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.107239 4786 scope.go:117] "RemoveContainer" containerID="0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142" Mar 13 11:51:11 crc kubenswrapper[4786]: E0313 11:51:11.107601 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142\": container with ID starting with 0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142 not found: ID does not exist" containerID="0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.107650 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142"} err="failed to get container status \"0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142\": rpc error: code = NotFound desc = could not find container \"0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142\": container with ID starting with 0ea9cf5720337033c824b129b9390e54fced6108b59f7edfdd6bdf42f0b09142 not found: ID does not exist" Mar 
13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.136114 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kmpwr" podStartSLOduration=7.69466562 podStartE2EDuration="26.136093704s" podCreationTimestamp="2026-03-13 11:50:45 +0000 UTC" firstStartedPulling="2026-03-13 11:50:52.062513315 +0000 UTC m=+239.342166762" lastFinishedPulling="2026-03-13 11:51:10.503941399 +0000 UTC m=+257.783594846" observedRunningTime="2026-03-13 11:51:11.132547936 +0000 UTC m=+258.412201403" watchObservedRunningTime="2026-03-13 11:51:11.136093704 +0000 UTC m=+258.415747171" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.166537 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff"] Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.170386 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-ksfff"] Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.453700 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fb55fe-bd61-406c-8bc2-c7963fabd631" path="/var/lib/kubelet/pods/25fb55fe-bd61-406c-8bc2-c7963fabd631/volumes" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.454718 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca20ab6-bb5c-45e0-b6d3-37725a376013" path="/var/lib/kubelet/pods/3ca20ab6-bb5c-45e0-b6d3-37725a376013/volumes" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.781071 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb"] Mar 13 11:51:11 crc kubenswrapper[4786]: E0313 11:51:11.781636 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca20ab6-bb5c-45e0-b6d3-37725a376013" containerName="route-controller-manager" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 
11:51:11.781656 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca20ab6-bb5c-45e0-b6d3-37725a376013" containerName="route-controller-manager" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.781785 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca20ab6-bb5c-45e0-b6d3-37725a376013" containerName="route-controller-manager" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.782237 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.790646 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.790850 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.791034 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.791207 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.791917 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.791915 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.798354 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb"] Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.845409 4786 
???:1] "http: TLS handshake error from 192.168.126.11:51476: no serving certificate available for the kubelet" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.877091 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5f4l\" (UniqueName: \"kubernetes.io/projected/526ecfd2-d5df-4724-bf36-bf17ebb355a6-kube-api-access-p5f4l\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.877143 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/526ecfd2-d5df-4724-bf36-bf17ebb355a6-serving-cert\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.877221 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-client-ca\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.877341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-config\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.978921 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-client-ca\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.979010 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-config\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.979046 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5f4l\" (UniqueName: \"kubernetes.io/projected/526ecfd2-d5df-4724-bf36-bf17ebb355a6-kube-api-access-p5f4l\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.979065 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/526ecfd2-d5df-4724-bf36-bf17ebb355a6-serving-cert\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.980561 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-client-ca\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: 
\"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.981004 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-config\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.985681 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/526ecfd2-d5df-4724-bf36-bf17ebb355a6-serving-cert\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:11 crc kubenswrapper[4786]: I0313 11:51:11.995268 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5f4l\" (UniqueName: \"kubernetes.io/projected/526ecfd2-d5df-4724-bf36-bf17ebb355a6-kube-api-access-p5f4l\") pod \"route-controller-manager-554bfc9fbc-94fpb\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.057771 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bjpz" event={"ID":"6cf9b878-214e-46cc-b417-49a01c7b5fc9","Type":"ContainerStarted","Data":"e3d723566c8e69b114354849629541bb9633acca15c47f8b8ea07c23b78eadd4"} Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.082550 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8bjpz" podStartSLOduration=25.382724388 podStartE2EDuration="27.082532805s" 
podCreationTimestamp="2026-03-13 11:50:45 +0000 UTC" firstStartedPulling="2026-03-13 11:51:09.971788334 +0000 UTC m=+257.251441781" lastFinishedPulling="2026-03-13 11:51:11.671596751 +0000 UTC m=+258.951250198" observedRunningTime="2026-03-13 11:51:12.078171856 +0000 UTC m=+259.357825313" watchObservedRunningTime="2026-03-13 11:51:12.082532805 +0000 UTC m=+259.362186252" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.151498 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.326910 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.391837 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad244ddf-d679-419f-9308-5584ea9b3e04-kubelet-dir\") pod \"ad244ddf-d679-419f-9308-5584ea9b3e04\" (UID: \"ad244ddf-d679-419f-9308-5584ea9b3e04\") " Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.391911 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad244ddf-d679-419f-9308-5584ea9b3e04-kube-api-access\") pod \"ad244ddf-d679-419f-9308-5584ea9b3e04\" (UID: \"ad244ddf-d679-419f-9308-5584ea9b3e04\") " Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.391956 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad244ddf-d679-419f-9308-5584ea9b3e04-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ad244ddf-d679-419f-9308-5584ea9b3e04" (UID: "ad244ddf-d679-419f-9308-5584ea9b3e04"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.392152 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad244ddf-d679-419f-9308-5584ea9b3e04-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.393297 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.396166 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad244ddf-d679-419f-9308-5584ea9b3e04-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ad244ddf-d679-419f-9308-5584ea9b3e04" (UID: "ad244ddf-d679-419f-9308-5584ea9b3e04"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.493073 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180ce447-e137-492a-bccd-f40344492a31-kubelet-dir\") pod \"180ce447-e137-492a-bccd-f40344492a31\" (UID: \"180ce447-e137-492a-bccd-f40344492a31\") " Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.493167 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180ce447-e137-492a-bccd-f40344492a31-kube-api-access\") pod \"180ce447-e137-492a-bccd-f40344492a31\" (UID: \"180ce447-e137-492a-bccd-f40344492a31\") " Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.493342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/180ce447-e137-492a-bccd-f40344492a31-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "180ce447-e137-492a-bccd-f40344492a31" (UID: 
"180ce447-e137-492a-bccd-f40344492a31"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.493780 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180ce447-e137-492a-bccd-f40344492a31-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.493798 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad244ddf-d679-419f-9308-5584ea9b3e04-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.497156 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180ce447-e137-492a-bccd-f40344492a31-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "180ce447-e137-492a-bccd-f40344492a31" (UID: "180ce447-e137-492a-bccd-f40344492a31"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.594662 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180ce447-e137-492a-bccd-f40344492a31-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:12 crc kubenswrapper[4786]: I0313 11:51:12.623467 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb"] Mar 13 11:51:12 crc kubenswrapper[4786]: W0313 11:51:12.635649 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526ecfd2_d5df_4724_bf36_bf17ebb355a6.slice/crio-bea098748fd8af02ae3cac77e3b3086e2d7ef1f5926f41e0cada555a3f881abf WatchSource:0}: Error finding container bea098748fd8af02ae3cac77e3b3086e2d7ef1f5926f41e0cada555a3f881abf: Status 404 returned error can't find the container with id bea098748fd8af02ae3cac77e3b3086e2d7ef1f5926f41e0cada555a3f881abf Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.065863 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" event={"ID":"526ecfd2-d5df-4724-bf36-bf17ebb355a6","Type":"ContainerStarted","Data":"9c7222814c3be4c613b9f61f79ffd61b0e15ea245e826c5b76b7a494ae4f1480"} Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.066242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" event={"ID":"526ecfd2-d5df-4724-bf36-bf17ebb355a6","Type":"ContainerStarted","Data":"bea098748fd8af02ae3cac77e3b3086e2d7ef1f5926f41e0cada555a3f881abf"} Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.066445 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 
11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.069400 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ad244ddf-d679-419f-9308-5584ea9b3e04","Type":"ContainerDied","Data":"a0943dbbef48d291174d0f304dcc1e96bbda9972143008b9db5e8c29a17c98f2"} Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.069433 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0943dbbef48d291174d0f304dcc1e96bbda9972143008b9db5e8c29a17c98f2" Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.069485 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.074560 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.075219 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"180ce447-e137-492a-bccd-f40344492a31","Type":"ContainerDied","Data":"0af8c3ea1c61c50b6b9ee482c65a75760e9ea7903b6f4a1b072a3caf0cc3bb53"} Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.075310 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af8c3ea1c61c50b6b9ee482c65a75760e9ea7903b6f4a1b072a3caf0cc3bb53" Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.085434 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" podStartSLOduration=12.085416458 podStartE2EDuration="12.085416458s" podCreationTimestamp="2026-03-13 11:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:13.08074408 +0000 UTC 
m=+260.360397547" watchObservedRunningTime="2026-03-13 11:51:13.085416458 +0000 UTC m=+260.365069905" Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.115148 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.852067 4786 csr.go:261] certificate signing request csr-bk4pk is approved, waiting to be issued Mar 13 11:51:13 crc kubenswrapper[4786]: I0313 11:51:13.857805 4786 csr.go:257] certificate signing request csr-bk4pk is issued Mar 13 11:51:14 crc kubenswrapper[4786]: I0313 11:51:14.082075 4786 generic.go:334] "Generic (PLEG): container finished" podID="01e610d5-a3b8-4fc8-a472-01ab5bb625d5" containerID="4adf9ddcb68c2f5c1789af829fd8b03591fc8522c100208f8726e9eb6c3d81b0" exitCode=0 Mar 13 11:51:14 crc kubenswrapper[4786]: I0313 11:51:14.082155 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" event={"ID":"01e610d5-a3b8-4fc8-a472-01ab5bb625d5","Type":"ContainerDied","Data":"4adf9ddcb68c2f5c1789af829fd8b03591fc8522c100208f8726e9eb6c3d81b0"} Mar 13 11:51:14 crc kubenswrapper[4786]: I0313 11:51:14.859149 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-08 20:33:04.913360919 +0000 UTC Mar 13 11:51:14 crc kubenswrapper[4786]: I0313 11:51:14.859185 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7232h41m50.054178659s for next certificate rotation Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.368444 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.435909 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5qp2\" (UniqueName: \"kubernetes.io/projected/01e610d5-a3b8-4fc8-a472-01ab5bb625d5-kube-api-access-q5qp2\") pod \"01e610d5-a3b8-4fc8-a472-01ab5bb625d5\" (UID: \"01e610d5-a3b8-4fc8-a472-01ab5bb625d5\") " Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.445411 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e610d5-a3b8-4fc8-a472-01ab5bb625d5-kube-api-access-q5qp2" (OuterVolumeSpecName: "kube-api-access-q5qp2") pod "01e610d5-a3b8-4fc8-a472-01ab5bb625d5" (UID: "01e610d5-a3b8-4fc8-a472-01ab5bb625d5"). InnerVolumeSpecName "kube-api-access-q5qp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.538077 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5qp2\" (UniqueName: \"kubernetes.io/projected/01e610d5-a3b8-4fc8-a472-01ab5bb625d5-kube-api-access-q5qp2\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.555743 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.555859 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.755446 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.859991 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-10 10:25:59.663225349 
+0000 UTC Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.860045 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6526h34m43.803183382s for next certificate rotation Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.987922 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:51:15 crc kubenswrapper[4786]: I0313 11:51:15.987963 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:51:16 crc kubenswrapper[4786]: I0313 11:51:16.027687 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:51:16 crc kubenswrapper[4786]: I0313 11:51:16.095178 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" event={"ID":"01e610d5-a3b8-4fc8-a472-01ab5bb625d5","Type":"ContainerDied","Data":"68826c95e2bbf9fff2963f8ce2b41ff813e94f66dfd2232b9a4b5d7c3dbbebfc"} Mar 13 11:51:16 crc kubenswrapper[4786]: I0313 11:51:16.095218 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-bmgbc" Mar 13 11:51:16 crc kubenswrapper[4786]: I0313 11:51:16.095225 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68826c95e2bbf9fff2963f8ce2b41ff813e94f66dfd2232b9a4b5d7c3dbbebfc" Mar 13 11:51:16 crc kubenswrapper[4786]: I0313 11:51:16.131532 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:51:16 crc kubenswrapper[4786]: I0313 11:51:16.133782 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:51:17 crc kubenswrapper[4786]: I0313 11:51:17.583215 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8ssmw" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.265636 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bjpz"] Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.266383 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8bjpz" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerName="registry-server" containerID="cri-o://e3d723566c8e69b114354849629541bb9633acca15c47f8b8ea07c23b78eadd4" gracePeriod=2 Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.665468 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 11:51:18 crc kubenswrapper[4786]: E0313 11:51:18.665815 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180ce447-e137-492a-bccd-f40344492a31" containerName="pruner" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.665894 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="180ce447-e137-492a-bccd-f40344492a31" containerName="pruner" Mar 13 
11:51:18 crc kubenswrapper[4786]: E0313 11:51:18.665912 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e610d5-a3b8-4fc8-a472-01ab5bb625d5" containerName="oc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.665918 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e610d5-a3b8-4fc8-a472-01ab5bb625d5" containerName="oc" Mar 13 11:51:18 crc kubenswrapper[4786]: E0313 11:51:18.665929 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad244ddf-d679-419f-9308-5584ea9b3e04" containerName="pruner" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.665956 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad244ddf-d679-419f-9308-5584ea9b3e04" containerName="pruner" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.666155 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad244ddf-d679-419f-9308-5584ea9b3e04" containerName="pruner" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.666169 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e610d5-a3b8-4fc8-a472-01ab5bb625d5" containerName="oc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.666205 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="180ce447-e137-492a-bccd-f40344492a31" containerName="pruner" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.666713 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.668564 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.668903 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.670384 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.787496 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b318237-0d61-4e7c-ae76-1d334b8c9952-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6b318237-0d61-4e7c-ae76-1d334b8c9952\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.787547 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b318237-0d61-4e7c-ae76-1d334b8c9952-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6b318237-0d61-4e7c-ae76-1d334b8c9952\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.889021 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b318237-0d61-4e7c-ae76-1d334b8c9952-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6b318237-0d61-4e7c-ae76-1d334b8c9952\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.889066 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6b318237-0d61-4e7c-ae76-1d334b8c9952-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6b318237-0d61-4e7c-ae76-1d334b8c9952\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.889172 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b318237-0d61-4e7c-ae76-1d334b8c9952-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6b318237-0d61-4e7c-ae76-1d334b8c9952\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.912696 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b318237-0d61-4e7c-ae76-1d334b8c9952-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6b318237-0d61-4e7c-ae76-1d334b8c9952\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:18 crc kubenswrapper[4786]: I0313 11:51:18.988035 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:19 crc kubenswrapper[4786]: I0313 11:51:19.110617 4786 generic.go:334] "Generic (PLEG): container finished" podID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerID="e3d723566c8e69b114354849629541bb9633acca15c47f8b8ea07c23b78eadd4" exitCode=0 Mar 13 11:51:19 crc kubenswrapper[4786]: I0313 11:51:19.110662 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bjpz" event={"ID":"6cf9b878-214e-46cc-b417-49a01c7b5fc9","Type":"ContainerDied","Data":"e3d723566c8e69b114354849629541bb9633acca15c47f8b8ea07c23b78eadd4"} Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.563697 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.609156 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-utilities\") pod \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.609199 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-catalog-content\") pod \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.609240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn27l\" (UniqueName: \"kubernetes.io/projected/6cf9b878-214e-46cc-b417-49a01c7b5fc9-kube-api-access-hn27l\") pod \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\" (UID: \"6cf9b878-214e-46cc-b417-49a01c7b5fc9\") " Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.610050 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-utilities" (OuterVolumeSpecName: "utilities") pod "6cf9b878-214e-46cc-b417-49a01c7b5fc9" (UID: "6cf9b878-214e-46cc-b417-49a01c7b5fc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.614985 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf9b878-214e-46cc-b417-49a01c7b5fc9-kube-api-access-hn27l" (OuterVolumeSpecName: "kube-api-access-hn27l") pod "6cf9b878-214e-46cc-b417-49a01c7b5fc9" (UID: "6cf9b878-214e-46cc-b417-49a01c7b5fc9"). InnerVolumeSpecName "kube-api-access-hn27l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.638638 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cf9b878-214e-46cc-b417-49a01c7b5fc9" (UID: "6cf9b878-214e-46cc-b417-49a01c7b5fc9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.710352 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.710387 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf9b878-214e-46cc-b417-49a01c7b5fc9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.710397 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn27l\" (UniqueName: \"kubernetes.io/projected/6cf9b878-214e-46cc-b417-49a01c7b5fc9-kube-api-access-hn27l\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:20 crc kubenswrapper[4786]: I0313 11:51:20.720416 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 11:51:20 crc kubenswrapper[4786]: W0313 11:51:20.723168 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6b318237_0d61_4e7c_ae76_1d334b8c9952.slice/crio-6524f8d4bbb9ac57b70309d29ea1078c28d1c68bfbfaba31bc92e46f4dc8bef2 WatchSource:0}: Error finding container 6524f8d4bbb9ac57b70309d29ea1078c28d1c68bfbfaba31bc92e46f4dc8bef2: Status 404 returned error can't find the container with id 6524f8d4bbb9ac57b70309d29ea1078c28d1c68bfbfaba31bc92e46f4dc8bef2 Mar 13 
11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.122812 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bjpz" event={"ID":"6cf9b878-214e-46cc-b417-49a01c7b5fc9","Type":"ContainerDied","Data":"e75ecc1523e9765f37010262ad1fc8c9f6b9708b298b3b0fc7169888779985c8"} Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.123201 4786 scope.go:117] "RemoveContainer" containerID="e3d723566c8e69b114354849629541bb9633acca15c47f8b8ea07c23b78eadd4" Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.122922 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bjpz" Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.148200 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gfdq" event={"ID":"17bbca1c-a838-4407-834c-45b6129b32b8","Type":"ContainerStarted","Data":"931bdea50fca736cfc8b71a47e129ef79b58f00021abeb2b983a2c39e96fc451"} Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.155951 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xftvt" event={"ID":"a6b7548d-0202-4690-b267-90076b5e4687","Type":"ContainerStarted","Data":"105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb"} Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.164578 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6b318237-0d61-4e7c-ae76-1d334b8c9952","Type":"ContainerStarted","Data":"bbfe955785409ace9fba6fc1268f2b0bd099eeae51b86bd0d7eb06bfe9f7014b"} Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.164656 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6b318237-0d61-4e7c-ae76-1d334b8c9952","Type":"ContainerStarted","Data":"6524f8d4bbb9ac57b70309d29ea1078c28d1c68bfbfaba31bc92e46f4dc8bef2"} Mar 13 11:51:21 crc 
kubenswrapper[4786]: I0313 11:51:21.176625 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bjpz"] Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.177810 4786 scope.go:117] "RemoveContainer" containerID="0dd40450eb04c83389c750fd5dae98e3c7ee1503a5c4b1bc4c3b5668c35505da" Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.179386 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bjpz"] Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.214614 4786 scope.go:117] "RemoveContainer" containerID="5836a261ed75f6ab224fa1bc890f0a381017b9a010ca7e2fc7f5a63c99f2cb81" Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.214734 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.214712033 podStartE2EDuration="3.214712033s" podCreationTimestamp="2026-03-13 11:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:21.212737329 +0000 UTC m=+268.492390776" watchObservedRunningTime="2026-03-13 11:51:21.214712033 +0000 UTC m=+268.494365490" Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.459435 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" path="/var/lib/kubelet/pods/6cf9b878-214e-46cc-b417-49a01c7b5fc9/volumes" Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.672720 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-647f6f76fb-cd8wq"] Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.673187 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" podUID="4f9deba0-f433-4584-bb71-431d7d4648d2" containerName="controller-manager" 
containerID="cri-o://afa16fe4adf95ccac10b15882726c23afd32780d2ab7744d0e9bf2cf41727e76" gracePeriod=30 Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.763652 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb"] Mar 13 11:51:21 crc kubenswrapper[4786]: I0313 11:51:21.763925 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" podUID="526ecfd2-d5df-4724-bf36-bf17ebb355a6" containerName="route-controller-manager" containerID="cri-o://9c7222814c3be4c613b9f61f79ffd61b0e15ea245e826c5b76b7a494ae4f1480" gracePeriod=30 Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.152736 4786 patch_prober.go:28] interesting pod/route-controller-manager-554bfc9fbc-94fpb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.152936 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" podUID="526ecfd2-d5df-4724-bf36-bf17ebb355a6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.172250 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerID="f54b62a936ddb568297e742a5873b389c9eb8d8e8ad94efcb9bfb62808a55236" exitCode=0 Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.172339 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bkf6" 
event={"ID":"1ca57952-a8b4-45bc-bf5a-1ddd025835c9","Type":"ContainerDied","Data":"f54b62a936ddb568297e742a5873b389c9eb8d8e8ad94efcb9bfb62808a55236"} Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.175292 4786 generic.go:334] "Generic (PLEG): container finished" podID="17bbca1c-a838-4407-834c-45b6129b32b8" containerID="931bdea50fca736cfc8b71a47e129ef79b58f00021abeb2b983a2c39e96fc451" exitCode=0 Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.175336 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gfdq" event={"ID":"17bbca1c-a838-4407-834c-45b6129b32b8","Type":"ContainerDied","Data":"931bdea50fca736cfc8b71a47e129ef79b58f00021abeb2b983a2c39e96fc451"} Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.183090 4786 generic.go:334] "Generic (PLEG): container finished" podID="a6b7548d-0202-4690-b267-90076b5e4687" containerID="105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb" exitCode=0 Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.183211 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xftvt" event={"ID":"a6b7548d-0202-4690-b267-90076b5e4687","Type":"ContainerDied","Data":"105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb"} Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.185691 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b318237-0d61-4e7c-ae76-1d334b8c9952" containerID="bbfe955785409ace9fba6fc1268f2b0bd099eeae51b86bd0d7eb06bfe9f7014b" exitCode=0 Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.185837 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6b318237-0d61-4e7c-ae76-1d334b8c9952","Type":"ContainerDied","Data":"bbfe955785409ace9fba6fc1268f2b0bd099eeae51b86bd0d7eb06bfe9f7014b"} Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.187601 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="939749d8-2927-47a2-8edc-77b4f307e813" containerID="7f9708bfe361fbd8f2c02fee29ecd426f04e52a177788f20f6e6fed6e33b2eb8" exitCode=0 Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.187673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tp7g" event={"ID":"939749d8-2927-47a2-8edc-77b4f307e813","Type":"ContainerDied","Data":"7f9708bfe361fbd8f2c02fee29ecd426f04e52a177788f20f6e6fed6e33b2eb8"} Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.194359 4786 generic.go:334] "Generic (PLEG): container finished" podID="4f9deba0-f433-4584-bb71-431d7d4648d2" containerID="afa16fe4adf95ccac10b15882726c23afd32780d2ab7744d0e9bf2cf41727e76" exitCode=0 Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.194421 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" event={"ID":"4f9deba0-f433-4584-bb71-431d7d4648d2","Type":"ContainerDied","Data":"afa16fe4adf95ccac10b15882726c23afd32780d2ab7744d0e9bf2cf41727e76"} Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.197015 4786 generic.go:334] "Generic (PLEG): container finished" podID="526ecfd2-d5df-4724-bf36-bf17ebb355a6" containerID="9c7222814c3be4c613b9f61f79ffd61b0e15ea245e826c5b76b7a494ae4f1480" exitCode=0 Mar 13 11:51:22 crc kubenswrapper[4786]: I0313 11:51:22.197065 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" event={"ID":"526ecfd2-d5df-4724-bf36-bf17ebb355a6","Type":"ContainerDied","Data":"9c7222814c3be4c613b9f61f79ffd61b0e15ea245e826c5b76b7a494ae4f1480"} Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.457202 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:51:24 crc kubenswrapper[4786]: E0313 11:51:24.460435 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" 
containerName="extract-content" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.460463 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerName="extract-content" Mar 13 11:51:24 crc kubenswrapper[4786]: E0313 11:51:24.460480 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerName="registry-server" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.460487 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerName="registry-server" Mar 13 11:51:24 crc kubenswrapper[4786]: E0313 11:51:24.460507 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerName="extract-utilities" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.460517 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerName="extract-utilities" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.460641 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf9b878-214e-46cc-b417-49a01c7b5fc9" containerName="registry-server" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.461130 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.481324 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.559159 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.559274 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kube-api-access\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.559333 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-var-lock\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.660463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-var-lock\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.660542 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.660610 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-var-lock\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.660631 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kube-api-access\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.660730 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.679553 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kube-api-access\") pod \"installer-9-crc\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.785209 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.911779 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.916242 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.922852 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.936104 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz"] Mar 13 11:51:24 crc kubenswrapper[4786]: E0313 11:51:24.936409 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b318237-0d61-4e7c-ae76-1d334b8c9952" containerName="pruner" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.936432 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b318237-0d61-4e7c-ae76-1d334b8c9952" containerName="pruner" Mar 13 11:51:24 crc kubenswrapper[4786]: E0313 11:51:24.936445 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526ecfd2-d5df-4724-bf36-bf17ebb355a6" containerName="route-controller-manager" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.936454 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="526ecfd2-d5df-4724-bf36-bf17ebb355a6" containerName="route-controller-manager" Mar 13 11:51:24 crc kubenswrapper[4786]: E0313 11:51:24.936471 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9deba0-f433-4584-bb71-431d7d4648d2" containerName="controller-manager" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.936480 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9deba0-f433-4584-bb71-431d7d4648d2" containerName="controller-manager" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.936598 4786 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6b318237-0d61-4e7c-ae76-1d334b8c9952" containerName="pruner" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.936616 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="526ecfd2-d5df-4724-bf36-bf17ebb355a6" containerName="route-controller-manager" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.936627 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9deba0-f433-4584-bb71-431d7d4648d2" containerName="controller-manager" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.937151 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.952606 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz"] Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.970355 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-proxy-ca-bundles\") pod \"4f9deba0-f433-4584-bb71-431d7d4648d2\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.970424 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-client-ca\") pod \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.970465 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b318237-0d61-4e7c-ae76-1d334b8c9952-kube-api-access\") pod \"6b318237-0d61-4e7c-ae76-1d334b8c9952\" (UID: 
\"6b318237-0d61-4e7c-ae76-1d334b8c9952\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.970495 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9deba0-f433-4584-bb71-431d7d4648d2-serving-cert\") pod \"4f9deba0-f433-4584-bb71-431d7d4648d2\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.970545 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x72zh\" (UniqueName: \"kubernetes.io/projected/4f9deba0-f433-4584-bb71-431d7d4648d2-kube-api-access-x72zh\") pod \"4f9deba0-f433-4584-bb71-431d7d4648d2\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.970569 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b318237-0d61-4e7c-ae76-1d334b8c9952-kubelet-dir\") pod \"6b318237-0d61-4e7c-ae76-1d334b8c9952\" (UID: \"6b318237-0d61-4e7c-ae76-1d334b8c9952\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.970596 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-config\") pod \"4f9deba0-f433-4584-bb71-431d7d4648d2\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.971300 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "526ecfd2-d5df-4724-bf36-bf17ebb355a6" (UID: "526ecfd2-d5df-4724-bf36-bf17ebb355a6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.971360 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b318237-0d61-4e7c-ae76-1d334b8c9952-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6b318237-0d61-4e7c-ae76-1d334b8c9952" (UID: "6b318237-0d61-4e7c-ae76-1d334b8c9952"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.971378 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4f9deba0-f433-4584-bb71-431d7d4648d2" (UID: "4f9deba0-f433-4584-bb71-431d7d4648d2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.972102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-config" (OuterVolumeSpecName: "config") pod "4f9deba0-f433-4584-bb71-431d7d4648d2" (UID: "4f9deba0-f433-4584-bb71-431d7d4648d2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.972572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5f4l\" (UniqueName: \"kubernetes.io/projected/526ecfd2-d5df-4724-bf36-bf17ebb355a6-kube-api-access-p5f4l\") pod \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.972609 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-config\") pod \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.972706 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-client-ca\") pod \"4f9deba0-f433-4584-bb71-431d7d4648d2\" (UID: \"4f9deba0-f433-4584-bb71-431d7d4648d2\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.972786 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/526ecfd2-d5df-4724-bf36-bf17ebb355a6-serving-cert\") pod \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\" (UID: \"526ecfd2-d5df-4724-bf36-bf17ebb355a6\") " Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.973449 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "4f9deba0-f433-4584-bb71-431d7d4648d2" (UID: "4f9deba0-f433-4584-bb71-431d7d4648d2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.973459 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-config" (OuterVolumeSpecName: "config") pod "526ecfd2-d5df-4724-bf36-bf17ebb355a6" (UID: "526ecfd2-d5df-4724-bf36-bf17ebb355a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.974337 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b318237-0d61-4e7c-ae76-1d334b8c9952-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.974364 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.974378 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.974391 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.974403 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f9deba0-f433-4584-bb71-431d7d4648d2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.974416 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/526ecfd2-d5df-4724-bf36-bf17ebb355a6-client-ca\") on 
node \"crc\" DevicePath \"\"" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.976376 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9deba0-f433-4584-bb71-431d7d4648d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4f9deba0-f433-4584-bb71-431d7d4648d2" (UID: "4f9deba0-f433-4584-bb71-431d7d4648d2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.976649 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b318237-0d61-4e7c-ae76-1d334b8c9952-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6b318237-0d61-4e7c-ae76-1d334b8c9952" (UID: "6b318237-0d61-4e7c-ae76-1d334b8c9952"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.978867 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/526ecfd2-d5df-4724-bf36-bf17ebb355a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "526ecfd2-d5df-4724-bf36-bf17ebb355a6" (UID: "526ecfd2-d5df-4724-bf36-bf17ebb355a6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.991253 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/526ecfd2-d5df-4724-bf36-bf17ebb355a6-kube-api-access-p5f4l" (OuterVolumeSpecName: "kube-api-access-p5f4l") pod "526ecfd2-d5df-4724-bf36-bf17ebb355a6" (UID: "526ecfd2-d5df-4724-bf36-bf17ebb355a6"). InnerVolumeSpecName "kube-api-access-p5f4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:24 crc kubenswrapper[4786]: I0313 11:51:24.992708 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9deba0-f433-4584-bb71-431d7d4648d2-kube-api-access-x72zh" (OuterVolumeSpecName: "kube-api-access-x72zh") pod "4f9deba0-f433-4584-bb71-431d7d4648d2" (UID: "4f9deba0-f433-4584-bb71-431d7d4648d2"). InnerVolumeSpecName "kube-api-access-x72zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075529 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-serving-cert\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075582 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-client-ca\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075606 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-config\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075669 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5s24\" (UniqueName: \"kubernetes.io/projected/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-kube-api-access-n5s24\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075841 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/526ecfd2-d5df-4724-bf36-bf17ebb355a6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075860 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b318237-0d61-4e7c-ae76-1d334b8c9952-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075873 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9deba0-f433-4584-bb71-431d7d4648d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075903 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x72zh\" (UniqueName: \"kubernetes.io/projected/4f9deba0-f433-4584-bb71-431d7d4648d2-kube-api-access-x72zh\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.075915 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5f4l\" (UniqueName: \"kubernetes.io/projected/526ecfd2-d5df-4724-bf36-bf17ebb355a6-kube-api-access-p5f4l\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.176660 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-serving-cert\") pod 
\"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.176991 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-client-ca\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.177012 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-config\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.177045 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5s24\" (UniqueName: \"kubernetes.io/projected/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-kube-api-access-n5s24\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.178005 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-client-ca\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.178289 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-config\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.182398 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-serving-cert\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.193350 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5s24\" (UniqueName: \"kubernetes.io/projected/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-kube-api-access-n5s24\") pod \"route-controller-manager-77bb897b76-6jqdz\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.226227 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" event={"ID":"526ecfd2-d5df-4724-bf36-bf17ebb355a6","Type":"ContainerDied","Data":"bea098748fd8af02ae3cac77e3b3086e2d7ef1f5926f41e0cada555a3f881abf"} Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.226303 4786 scope.go:117] "RemoveContainer" containerID="9c7222814c3be4c613b9f61f79ffd61b0e15ea245e826c5b76b7a494ae4f1480" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.226308 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.232196 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6b318237-0d61-4e7c-ae76-1d334b8c9952","Type":"ContainerDied","Data":"6524f8d4bbb9ac57b70309d29ea1078c28d1c68bfbfaba31bc92e46f4dc8bef2"} Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.232226 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.232235 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6524f8d4bbb9ac57b70309d29ea1078c28d1c68bfbfaba31bc92e46f4dc8bef2" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.233625 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" event={"ID":"4f9deba0-f433-4584-bb71-431d7d4648d2","Type":"ContainerDied","Data":"73f109753e6644546190f5091b039d11c8af4697bae2ed2426e44301097159ae"} Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.233690 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-647f6f76fb-cd8wq" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.269122 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb"] Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.271421 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.274538 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554bfc9fbc-94fpb"] Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.277837 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-647f6f76fb-cd8wq"] Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.280041 4786 scope.go:117] "RemoveContainer" containerID="afa16fe4adf95ccac10b15882726c23afd32780d2ab7744d0e9bf2cf41727e76" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.282995 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-647f6f76fb-cd8wq"] Mar 13 11:51:25 crc kubenswrapper[4786]: E0313 11:51:25.285466 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod6b318237_0d61_4e7c_ae76_1d334b8c9952.slice\": RecentStats: unable to find data in memory cache]" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.460171 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9deba0-f433-4584-bb71-431d7d4648d2" path="/var/lib/kubelet/pods/4f9deba0-f433-4584-bb71-431d7d4648d2/volumes" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.461406 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526ecfd2-d5df-4724-bf36-bf17ebb355a6" path="/var/lib/kubelet/pods/526ecfd2-d5df-4724-bf36-bf17ebb355a6/volumes" Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.693689 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:51:25 crc kubenswrapper[4786]: I0313 11:51:25.793924 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz"] Mar 13 11:51:25 crc kubenswrapper[4786]: W0313 11:51:25.798995 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f78cf2_9cd7_4574_9f50_2053cab39a5d.slice/crio-f495a56e98492d16ce68dbf7fcf6cd7d36ec143f7b192f50e683ee70efa596da WatchSource:0}: Error finding container f495a56e98492d16ce68dbf7fcf6cd7d36ec143f7b192f50e683ee70efa596da: Status 404 returned error can't find the container with id f495a56e98492d16ce68dbf7fcf6cd7d36ec143f7b192f50e683ee70efa596da Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.241271 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62","Type":"ContainerStarted","Data":"d332a85de0f5274a1eb6c3aafe0f79357d5ab8783df86a66fa4866033e4b4aa5"} Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.245098 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94x7m" event={"ID":"f8353c7b-cabe-46a6-8a98-aea4bad6b499","Type":"ContainerStarted","Data":"d683800f8751c675fed57a968f8be2ca7dd6fbbabd9392bd02e31079970554d1"} Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.248203 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xftvt" event={"ID":"a6b7548d-0202-4690-b267-90076b5e4687","Type":"ContainerStarted","Data":"72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e"} Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.251244 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tp7g" event={"ID":"939749d8-2927-47a2-8edc-77b4f307e813","Type":"ContainerStarted","Data":"4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6"} Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.252740 4786 generic.go:334] 
"Generic (PLEG): container finished" podID="be9e61e1-45b9-42e3-899f-495a710537fc" containerID="b607280f98744618d5fa4830e7cb1e122be742efc698c6e0b278ac1ffb95d2d9" exitCode=0 Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.252766 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4982" event={"ID":"be9e61e1-45b9-42e3-899f-495a710537fc","Type":"ContainerDied","Data":"b607280f98744618d5fa4830e7cb1e122be742efc698c6e0b278ac1ffb95d2d9"} Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.253822 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" event={"ID":"d3f78cf2-9cd7-4574-9f50-2053cab39a5d","Type":"ContainerStarted","Data":"f495a56e98492d16ce68dbf7fcf6cd7d36ec143f7b192f50e683ee70efa596da"} Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.256780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bkf6" event={"ID":"1ca57952-a8b4-45bc-bf5a-1ddd025835c9","Type":"ContainerStarted","Data":"13ec4a5cfd6d6cf13ec57f36715552e840a0b9c7235367b6cbf28a0526f67a53"} Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.258915 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gfdq" event={"ID":"17bbca1c-a838-4407-834c-45b6129b32b8","Type":"ContainerStarted","Data":"a1c805c41c287573eb6a6e548fdc662c312406b3b3e4b54ab80931782ddb58e7"} Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.306061 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9bkf6" podStartSLOduration=2.666939523 podStartE2EDuration="43.306042202s" podCreationTimestamp="2026-03-13 11:50:43 +0000 UTC" firstStartedPulling="2026-03-13 11:50:44.735287557 +0000 UTC m=+232.014940994" lastFinishedPulling="2026-03-13 11:51:25.374390226 +0000 UTC m=+272.654043673" observedRunningTime="2026-03-13 
11:51:26.292991129 +0000 UTC m=+273.572644596" watchObservedRunningTime="2026-03-13 11:51:26.306042202 +0000 UTC m=+273.585695649" Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.330818 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4gfdq" podStartSLOduration=25.988175947 podStartE2EDuration="40.330797792s" podCreationTimestamp="2026-03-13 11:50:46 +0000 UTC" firstStartedPulling="2026-03-13 11:51:11.017122904 +0000 UTC m=+258.296776361" lastFinishedPulling="2026-03-13 11:51:25.359744759 +0000 UTC m=+272.639398206" observedRunningTime="2026-03-13 11:51:26.307346357 +0000 UTC m=+273.586999804" watchObservedRunningTime="2026-03-13 11:51:26.330797792 +0000 UTC m=+273.610451239" Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.332042 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xftvt" podStartSLOduration=24.97102704 podStartE2EDuration="40.332036535s" podCreationTimestamp="2026-03-13 11:50:46 +0000 UTC" firstStartedPulling="2026-03-13 11:51:09.998611351 +0000 UTC m=+257.278264798" lastFinishedPulling="2026-03-13 11:51:25.359620836 +0000 UTC m=+272.639274293" observedRunningTime="2026-03-13 11:51:26.330790222 +0000 UTC m=+273.610443689" watchObservedRunningTime="2026-03-13 11:51:26.332036535 +0000 UTC m=+273.611689992" Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.802130 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:51:26 crc kubenswrapper[4786]: I0313 11:51:26.802166 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.176659 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 
11:51:27.176934 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.265533 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" event={"ID":"d3f78cf2-9cd7-4574-9f50-2053cab39a5d","Type":"ContainerStarted","Data":"96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7"} Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.265743 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.267161 4786 generic.go:334] "Generic (PLEG): container finished" podID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerID="d683800f8751c675fed57a968f8be2ca7dd6fbbabd9392bd02e31079970554d1" exitCode=0 Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.267242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94x7m" event={"ID":"f8353c7b-cabe-46a6-8a98-aea4bad6b499","Type":"ContainerDied","Data":"d683800f8751c675fed57a968f8be2ca7dd6fbbabd9392bd02e31079970554d1"} Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.271643 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4982" event={"ID":"be9e61e1-45b9-42e3-899f-495a710537fc","Type":"ContainerStarted","Data":"f8ebe6486480758e4ce1344261ea9c1c54d72f291585c2816bb2bf84aa5d970e"} Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.272847 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62","Type":"ContainerStarted","Data":"2b9922be607248f1c765ddd2032e8fe62e1d0161372782345f9f6f14eee36643"} Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.273668 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.284761 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" podStartSLOduration=6.284738001 podStartE2EDuration="6.284738001s" podCreationTimestamp="2026-03-13 11:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:27.281426222 +0000 UTC m=+274.561079699" watchObservedRunningTime="2026-03-13 11:51:27.284738001 +0000 UTC m=+274.564391458" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.286009 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4tp7g" podStartSLOduration=3.463850556 podStartE2EDuration="44.286000005s" podCreationTimestamp="2026-03-13 11:50:43 +0000 UTC" firstStartedPulling="2026-03-13 11:50:44.715608286 +0000 UTC m=+231.995261733" lastFinishedPulling="2026-03-13 11:51:25.537757735 +0000 UTC m=+272.817411182" observedRunningTime="2026-03-13 11:51:26.381196095 +0000 UTC m=+273.660849552" watchObservedRunningTime="2026-03-13 11:51:27.286000005 +0000 UTC m=+274.565653462" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.298860 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.298845302 podStartE2EDuration="3.298845302s" podCreationTimestamp="2026-03-13 11:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:27.296835239 +0000 UTC m=+274.576488676" watchObservedRunningTime="2026-03-13 11:51:27.298845302 +0000 UTC m=+274.578498749" Mar 13 11:51:27 crc 
kubenswrapper[4786]: I0313 11:51:27.356543 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f4982" podStartSLOduration=2.203378936 podStartE2EDuration="44.356519433s" podCreationTimestamp="2026-03-13 11:50:43 +0000 UTC" firstStartedPulling="2026-03-13 11:50:44.721117868 +0000 UTC m=+232.000771315" lastFinishedPulling="2026-03-13 11:51:26.874258365 +0000 UTC m=+274.153911812" observedRunningTime="2026-03-13 11:51:27.354050237 +0000 UTC m=+274.633703694" watchObservedRunningTime="2026-03-13 11:51:27.356519433 +0000 UTC m=+274.636172900" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.791365 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74d94c9989-78pl2"] Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.792185 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.796039 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.797005 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.797796 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.798850 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.799647 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.799716 4786 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.803666 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74d94c9989-78pl2"] Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.809825 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.846062 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4gfdq" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="registry-server" probeResult="failure" output=< Mar 13 11:51:27 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 11:51:27 crc kubenswrapper[4786]: > Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.920163 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-config\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.920551 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-client-ca\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.920596 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/29fd8c54-fe62-4644-9cce-ec5bda577457-serving-cert\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.920645 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rsk5\" (UniqueName: \"kubernetes.io/projected/29fd8c54-fe62-4644-9cce-ec5bda577457-kube-api-access-5rsk5\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:27 crc kubenswrapper[4786]: I0313 11:51:27.920721 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-proxy-ca-bundles\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.022344 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-proxy-ca-bundles\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.022669 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-config\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc 
kubenswrapper[4786]: I0313 11:51:28.022758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-client-ca\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.022839 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fd8c54-fe62-4644-9cce-ec5bda577457-serving-cert\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.022950 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rsk5\" (UniqueName: \"kubernetes.io/projected/29fd8c54-fe62-4644-9cce-ec5bda577457-kube-api-access-5rsk5\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.023396 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-proxy-ca-bundles\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.023669 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-client-ca\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " 
pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.024062 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-config\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.029040 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fd8c54-fe62-4644-9cce-ec5bda577457-serving-cert\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.040713 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rsk5\" (UniqueName: \"kubernetes.io/projected/29fd8c54-fe62-4644-9cce-ec5bda577457-kube-api-access-5rsk5\") pod \"controller-manager-74d94c9989-78pl2\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.106387 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.220780 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xftvt" podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="registry-server" probeResult="failure" output=< Mar 13 11:51:28 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 11:51:28 crc kubenswrapper[4786]: > Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.286252 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94x7m" event={"ID":"f8353c7b-cabe-46a6-8a98-aea4bad6b499","Type":"ContainerStarted","Data":"3c458de7ab7dff8cb9473785e2e04d3b9b69180295a3114a33277daa2dd5af89"} Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.571665 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-94x7m" podStartSLOduration=2.629322905 podStartE2EDuration="45.571647129s" podCreationTimestamp="2026-03-13 11:50:43 +0000 UTC" firstStartedPulling="2026-03-13 11:50:44.732146871 +0000 UTC m=+232.011800318" lastFinishedPulling="2026-03-13 11:51:27.674471105 +0000 UTC m=+274.954124542" observedRunningTime="2026-03-13 11:51:28.304240214 +0000 UTC m=+275.583893671" watchObservedRunningTime="2026-03-13 11:51:28.571647129 +0000 UTC m=+275.851300576" Mar 13 11:51:28 crc kubenswrapper[4786]: I0313 11:51:28.573103 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74d94c9989-78pl2"] Mar 13 11:51:29 crc kubenswrapper[4786]: I0313 11:51:29.291653 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" event={"ID":"29fd8c54-fe62-4644-9cce-ec5bda577457","Type":"ContainerStarted","Data":"8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e"} 
Mar 13 11:51:29 crc kubenswrapper[4786]: I0313 11:51:29.291698 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" event={"ID":"29fd8c54-fe62-4644-9cce-ec5bda577457","Type":"ContainerStarted","Data":"aa1a93847f8a9d931e1e72b6de4023803dcfea3750958395980ce0420da99294"} Mar 13 11:51:29 crc kubenswrapper[4786]: I0313 11:51:29.306759 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" podStartSLOduration=8.306740337 podStartE2EDuration="8.306740337s" podCreationTimestamp="2026-03-13 11:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:29.303943971 +0000 UTC m=+276.583597428" watchObservedRunningTime="2026-03-13 11:51:29.306740337 +0000 UTC m=+276.586393784" Mar 13 11:51:30 crc kubenswrapper[4786]: I0313 11:51:30.296387 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:30 crc kubenswrapper[4786]: I0313 11:51:30.300796 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:33 crc kubenswrapper[4786]: I0313 11:51:33.605609 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:51:33 crc kubenswrapper[4786]: I0313 11:51:33.605689 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:33.686249 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:33.770695 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:33.770842 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:33.846785 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:33.965914 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:33.966018 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:34.023470 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:34.160138 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:34.160188 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:34.236324 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:34.387742 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:34.388132 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:34.391831 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:51:34 crc kubenswrapper[4786]: I0313 11:51:34.403177 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:51:35 crc kubenswrapper[4786]: I0313 11:51:35.530963 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4982"] Mar 13 11:51:36 crc kubenswrapper[4786]: I0313 11:51:36.531323 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9bkf6"] Mar 13 11:51:36 crc kubenswrapper[4786]: I0313 11:51:36.531607 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9bkf6" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerName="registry-server" containerID="cri-o://13ec4a5cfd6d6cf13ec57f36715552e840a0b9c7235367b6cbf28a0526f67a53" gracePeriod=2 Mar 13 11:51:36 crc kubenswrapper[4786]: I0313 11:51:36.870238 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:51:36 crc kubenswrapper[4786]: I0313 11:51:36.934372 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:51:37 crc kubenswrapper[4786]: I0313 11:51:37.232925 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:51:37 crc kubenswrapper[4786]: I0313 11:51:37.303069 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:51:37 crc kubenswrapper[4786]: I0313 11:51:37.349045 4786 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f4982" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" containerName="registry-server" containerID="cri-o://f8ebe6486480758e4ce1344261ea9c1c54d72f291585c2816bb2bf84aa5d970e" gracePeriod=2 Mar 13 11:51:38 crc kubenswrapper[4786]: I0313 11:51:38.169870 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:51:38 crc kubenswrapper[4786]: I0313 11:51:38.170563 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:51:38 crc kubenswrapper[4786]: I0313 11:51:38.170654 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:51:38 crc kubenswrapper[4786]: I0313 11:51:38.172316 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 11:51:38 crc kubenswrapper[4786]: I0313 11:51:38.172435 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" 
containerID="cri-o://851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3" gracePeriod=600 Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.363427 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3" exitCode=0 Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.363739 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3"} Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.366039 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerID="13ec4a5cfd6d6cf13ec57f36715552e840a0b9c7235367b6cbf28a0526f67a53" exitCode=0 Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.366082 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bkf6" event={"ID":"1ca57952-a8b4-45bc-bf5a-1ddd025835c9","Type":"ContainerDied","Data":"13ec4a5cfd6d6cf13ec57f36715552e840a0b9c7235367b6cbf28a0526f67a53"} Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.367766 4786 generic.go:334] "Generic (PLEG): container finished" podID="be9e61e1-45b9-42e3-899f-495a710537fc" containerID="f8ebe6486480758e4ce1344261ea9c1c54d72f291585c2816bb2bf84aa5d970e" exitCode=0 Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.367795 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4982" event={"ID":"be9e61e1-45b9-42e3-899f-495a710537fc","Type":"ContainerDied","Data":"f8ebe6486480758e4ce1344261ea9c1c54d72f291585c2816bb2bf84aa5d970e"} Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.510790 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.517791 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.661123 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-catalog-content\") pod \"be9e61e1-45b9-42e3-899f-495a710537fc\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.661232 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-utilities\") pod \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.661298 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zj65\" (UniqueName: \"kubernetes.io/projected/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-kube-api-access-7zj65\") pod \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.661385 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-catalog-content\") pod \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\" (UID: \"1ca57952-a8b4-45bc-bf5a-1ddd025835c9\") " Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.661426 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-utilities\") pod \"be9e61e1-45b9-42e3-899f-495a710537fc\" 
(UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.661447 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c2rc\" (UniqueName: \"kubernetes.io/projected/be9e61e1-45b9-42e3-899f-495a710537fc-kube-api-access-5c2rc\") pod \"be9e61e1-45b9-42e3-899f-495a710537fc\" (UID: \"be9e61e1-45b9-42e3-899f-495a710537fc\") " Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.662617 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-utilities" (OuterVolumeSpecName: "utilities") pod "be9e61e1-45b9-42e3-899f-495a710537fc" (UID: "be9e61e1-45b9-42e3-899f-495a710537fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.665782 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-utilities" (OuterVolumeSpecName: "utilities") pod "1ca57952-a8b4-45bc-bf5a-1ddd025835c9" (UID: "1ca57952-a8b4-45bc-bf5a-1ddd025835c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.667038 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-kube-api-access-7zj65" (OuterVolumeSpecName: "kube-api-access-7zj65") pod "1ca57952-a8b4-45bc-bf5a-1ddd025835c9" (UID: "1ca57952-a8b4-45bc-bf5a-1ddd025835c9"). InnerVolumeSpecName "kube-api-access-7zj65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.667177 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be9e61e1-45b9-42e3-899f-495a710537fc-kube-api-access-5c2rc" (OuterVolumeSpecName: "kube-api-access-5c2rc") pod "be9e61e1-45b9-42e3-899f-495a710537fc" (UID: "be9e61e1-45b9-42e3-899f-495a710537fc"). InnerVolumeSpecName "kube-api-access-5c2rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.719836 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be9e61e1-45b9-42e3-899f-495a710537fc" (UID: "be9e61e1-45b9-42e3-899f-495a710537fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.734635 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ca57952-a8b4-45bc-bf5a-1ddd025835c9" (UID: "1ca57952-a8b4-45bc-bf5a-1ddd025835c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.762968 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.763022 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c2rc\" (UniqueName: \"kubernetes.io/projected/be9e61e1-45b9-42e3-899f-495a710537fc-kube-api-access-5c2rc\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.763042 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9e61e1-45b9-42e3-899f-495a710537fc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.763059 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.763078 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zj65\" (UniqueName: \"kubernetes.io/projected/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-kube-api-access-7zj65\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4786]: I0313 11:51:39.763152 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca57952-a8b4-45bc-bf5a-1ddd025835c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.326125 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xftvt"] Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.326496 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-xftvt" podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="registry-server" containerID="cri-o://72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e" gracePeriod=2 Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.376618 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4982" event={"ID":"be9e61e1-45b9-42e3-899f-495a710537fc","Type":"ContainerDied","Data":"2de0f548e7130f2b0b682fe631a01388002b8ca99858012c803865213bebd447"} Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.376670 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4982" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.376716 4786 scope.go:117] "RemoveContainer" containerID="f8ebe6486480758e4ce1344261ea9c1c54d72f291585c2816bb2bf84aa5d970e" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.381756 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"916375dbd60d5646bdf04f7b3ff54e5cebfbc453558600eb317d36e2a6093d7a"} Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.385078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bkf6" event={"ID":"1ca57952-a8b4-45bc-bf5a-1ddd025835c9","Type":"ContainerDied","Data":"5fd76ee01716ca20fd9dc31bed28c2e80f3cafbb12f968481dc98f7675f73650"} Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.385305 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9bkf6" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.449607 4786 scope.go:117] "RemoveContainer" containerID="b607280f98744618d5fa4830e7cb1e122be742efc698c6e0b278ac1ffb95d2d9" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.468590 4786 scope.go:117] "RemoveContainer" containerID="4c3959c467d5b50937d21f390d8b593ce05082dc9423208c9f1fffcd068b3b33" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.480173 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4982"] Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.489552 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f4982"] Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.492466 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9bkf6"] Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.495143 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9bkf6"] Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.499117 4786 scope.go:117] "RemoveContainer" containerID="13ec4a5cfd6d6cf13ec57f36715552e840a0b9c7235367b6cbf28a0526f67a53" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.525786 4786 scope.go:117] "RemoveContainer" containerID="f54b62a936ddb568297e742a5873b389c9eb8d8e8ad94efcb9bfb62808a55236" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.541403 4786 scope.go:117] "RemoveContainer" containerID="a44f34f006e0d256df37e66b8eef7e0be9929451119df2928a03a9bf629c8fb7" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.782832 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.978055 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-catalog-content\") pod \"a6b7548d-0202-4690-b267-90076b5e4687\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.978224 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-utilities\") pod \"a6b7548d-0202-4690-b267-90076b5e4687\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.978408 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gx7v\" (UniqueName: \"kubernetes.io/projected/a6b7548d-0202-4690-b267-90076b5e4687-kube-api-access-8gx7v\") pod \"a6b7548d-0202-4690-b267-90076b5e4687\" (UID: \"a6b7548d-0202-4690-b267-90076b5e4687\") " Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.979529 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-utilities" (OuterVolumeSpecName: "utilities") pod "a6b7548d-0202-4690-b267-90076b5e4687" (UID: "a6b7548d-0202-4690-b267-90076b5e4687"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:51:40 crc kubenswrapper[4786]: I0313 11:51:40.983377 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b7548d-0202-4690-b267-90076b5e4687-kube-api-access-8gx7v" (OuterVolumeSpecName: "kube-api-access-8gx7v") pod "a6b7548d-0202-4690-b267-90076b5e4687" (UID: "a6b7548d-0202-4690-b267-90076b5e4687"). InnerVolumeSpecName "kube-api-access-8gx7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.080186 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gx7v\" (UniqueName: \"kubernetes.io/projected/a6b7548d-0202-4690-b267-90076b5e4687-kube-api-access-8gx7v\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.080534 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.127133 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6b7548d-0202-4690-b267-90076b5e4687" (UID: "a6b7548d-0202-4690-b267-90076b5e4687"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.181948 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b7548d-0202-4690-b267-90076b5e4687-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.397553 4786 generic.go:334] "Generic (PLEG): container finished" podID="a6b7548d-0202-4690-b267-90076b5e4687" containerID="72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e" exitCode=0 Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.397633 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xftvt" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.397673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xftvt" event={"ID":"a6b7548d-0202-4690-b267-90076b5e4687","Type":"ContainerDied","Data":"72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e"} Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.397752 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xftvt" event={"ID":"a6b7548d-0202-4690-b267-90076b5e4687","Type":"ContainerDied","Data":"276c25a245a1f429b1f571ad21eb54e2b9bae336e06c670ded731d3a8286378a"} Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.397775 4786 scope.go:117] "RemoveContainer" containerID="72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.418513 4786 scope.go:117] "RemoveContainer" containerID="105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.431538 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xftvt"] Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.437189 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xftvt"] Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.448199 4786 scope.go:117] "RemoveContainer" containerID="566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.449263 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" path="/var/lib/kubelet/pods/1ca57952-a8b4-45bc-bf5a-1ddd025835c9/volumes" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.450355 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a6b7548d-0202-4690-b267-90076b5e4687" path="/var/lib/kubelet/pods/a6b7548d-0202-4690-b267-90076b5e4687/volumes" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.451212 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" path="/var/lib/kubelet/pods/be9e61e1-45b9-42e3-899f-495a710537fc/volumes" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.482121 4786 scope.go:117] "RemoveContainer" containerID="72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e" Mar 13 11:51:41 crc kubenswrapper[4786]: E0313 11:51:41.486313 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e\": container with ID starting with 72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e not found: ID does not exist" containerID="72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.486378 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e"} err="failed to get container status \"72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e\": rpc error: code = NotFound desc = could not find container \"72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e\": container with ID starting with 72f4a1a25e37e85408939ce19c1f635bf935b985d2d805811c59cf4d9ae8c48e not found: ID does not exist" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.486413 4786 scope.go:117] "RemoveContainer" containerID="105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb" Mar 13 11:51:41 crc kubenswrapper[4786]: E0313 11:51:41.486808 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb\": container with ID starting with 105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb not found: ID does not exist" containerID="105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.486858 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb"} err="failed to get container status \"105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb\": rpc error: code = NotFound desc = could not find container \"105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb\": container with ID starting with 105455fac9eee2699a8a11bb96ed32def66b930297f7ca39aa05f4a617d0abdb not found: ID does not exist" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.486909 4786 scope.go:117] "RemoveContainer" containerID="566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde" Mar 13 11:51:41 crc kubenswrapper[4786]: E0313 11:51:41.487441 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde\": container with ID starting with 566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde not found: ID does not exist" containerID="566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.487501 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde"} err="failed to get container status \"566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde\": rpc error: code = NotFound desc = could not find container \"566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde\": container with ID 
starting with 566ced0d9be9c00546b53d0feee5a016b9a50fe5767f5fa83d1f4cf8e3c85fde not found: ID does not exist" Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.691056 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74d94c9989-78pl2"] Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.691290 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" podUID="29fd8c54-fe62-4644-9cce-ec5bda577457" containerName="controller-manager" containerID="cri-o://8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e" gracePeriod=30 Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.718546 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz"] Mar 13 11:51:41 crc kubenswrapper[4786]: I0313 11:51:41.718855 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" podUID="d3f78cf2-9cd7-4574-9f50-2053cab39a5d" containerName="route-controller-manager" containerID="cri-o://96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7" gracePeriod=30 Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.207482 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.249924 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.396799 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5s24\" (UniqueName: \"kubernetes.io/projected/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-kube-api-access-n5s24\") pod \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.396921 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-client-ca\") pod \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.398039 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-client-ca" (OuterVolumeSpecName: "client-ca") pod "d3f78cf2-9cd7-4574-9f50-2053cab39a5d" (UID: "d3f78cf2-9cd7-4574-9f50-2053cab39a5d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.398294 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rsk5\" (UniqueName: \"kubernetes.io/projected/29fd8c54-fe62-4644-9cce-ec5bda577457-kube-api-access-5rsk5\") pod \"29fd8c54-fe62-4644-9cce-ec5bda577457\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.398331 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-config\") pod \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.398368 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fd8c54-fe62-4644-9cce-ec5bda577457-serving-cert\") pod \"29fd8c54-fe62-4644-9cce-ec5bda577457\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.398399 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-serving-cert\") pod \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\" (UID: \"d3f78cf2-9cd7-4574-9f50-2053cab39a5d\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.398484 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-config\") pod \"29fd8c54-fe62-4644-9cce-ec5bda577457\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.398574 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-client-ca\") pod \"29fd8c54-fe62-4644-9cce-ec5bda577457\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.398687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-proxy-ca-bundles\") pod \"29fd8c54-fe62-4644-9cce-ec5bda577457\" (UID: \"29fd8c54-fe62-4644-9cce-ec5bda577457\") " Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.399582 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.399632 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "29fd8c54-fe62-4644-9cce-ec5bda577457" (UID: "29fd8c54-fe62-4644-9cce-ec5bda577457"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.400177 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-client-ca" (OuterVolumeSpecName: "client-ca") pod "29fd8c54-fe62-4644-9cce-ec5bda577457" (UID: "29fd8c54-fe62-4644-9cce-ec5bda577457"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.400502 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-config" (OuterVolumeSpecName: "config") pod "29fd8c54-fe62-4644-9cce-ec5bda577457" (UID: "29fd8c54-fe62-4644-9cce-ec5bda577457"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.400543 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-config" (OuterVolumeSpecName: "config") pod "d3f78cf2-9cd7-4574-9f50-2053cab39a5d" (UID: "d3f78cf2-9cd7-4574-9f50-2053cab39a5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.403106 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d3f78cf2-9cd7-4574-9f50-2053cab39a5d" (UID: "d3f78cf2-9cd7-4574-9f50-2053cab39a5d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.403254 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29fd8c54-fe62-4644-9cce-ec5bda577457-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "29fd8c54-fe62-4644-9cce-ec5bda577457" (UID: "29fd8c54-fe62-4644-9cce-ec5bda577457"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.404090 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29fd8c54-fe62-4644-9cce-ec5bda577457-kube-api-access-5rsk5" (OuterVolumeSpecName: "kube-api-access-5rsk5") pod "29fd8c54-fe62-4644-9cce-ec5bda577457" (UID: "29fd8c54-fe62-4644-9cce-ec5bda577457"). InnerVolumeSpecName "kube-api-access-5rsk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.404152 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-kube-api-access-n5s24" (OuterVolumeSpecName: "kube-api-access-n5s24") pod "d3f78cf2-9cd7-4574-9f50-2053cab39a5d" (UID: "d3f78cf2-9cd7-4574-9f50-2053cab39a5d"). InnerVolumeSpecName "kube-api-access-n5s24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.404618 4786 generic.go:334] "Generic (PLEG): container finished" podID="29fd8c54-fe62-4644-9cce-ec5bda577457" containerID="8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e" exitCode=0 Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.404702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" event={"ID":"29fd8c54-fe62-4644-9cce-ec5bda577457","Type":"ContainerDied","Data":"8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e"} Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.404736 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" event={"ID":"29fd8c54-fe62-4644-9cce-ec5bda577457","Type":"ContainerDied","Data":"aa1a93847f8a9d931e1e72b6de4023803dcfea3750958395980ce0420da99294"} Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.404763 4786 scope.go:117] "RemoveContainer" containerID="8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.404898 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74d94c9989-78pl2" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.412938 4786 generic.go:334] "Generic (PLEG): container finished" podID="d3f78cf2-9cd7-4574-9f50-2053cab39a5d" containerID="96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7" exitCode=0 Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.413176 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" event={"ID":"d3f78cf2-9cd7-4574-9f50-2053cab39a5d","Type":"ContainerDied","Data":"96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7"} Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.413245 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" event={"ID":"d3f78cf2-9cd7-4574-9f50-2053cab39a5d","Type":"ContainerDied","Data":"f495a56e98492d16ce68dbf7fcf6cd7d36ec143f7b192f50e683ee70efa596da"} Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.413537 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.425268 4786 scope.go:117] "RemoveContainer" containerID="8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.426991 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e\": container with ID starting with 8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e not found: ID does not exist" containerID="8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.427042 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e"} err="failed to get container status \"8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e\": rpc error: code = NotFound desc = could not find container \"8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e\": container with ID starting with 8afb861d911bf67b98799bd2d1f9d305f65f04177d45e34cdb83a8a2dc06638e not found: ID does not exist" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.427072 4786 scope.go:117] "RemoveContainer" containerID="96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.451060 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74d94c9989-78pl2"] Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.451528 4786 scope.go:117] "RemoveContainer" containerID="96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.452136 4786 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7\": container with ID starting with 96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7 not found: ID does not exist" containerID="96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.452224 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7"} err="failed to get container status \"96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7\": rpc error: code = NotFound desc = could not find container \"96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7\": container with ID starting with 96e1dcf96eb251367c368e8fd924ce0107f6b85509ba41749ffdedf4a3891ee7 not found: ID does not exist" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.453518 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74d94c9989-78pl2"] Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.488419 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz"] Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.496160 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bb897b76-6jqdz"] Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.500759 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5s24\" (UniqueName: \"kubernetes.io/projected/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-kube-api-access-n5s24\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.500807 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rsk5\" (UniqueName: 
\"kubernetes.io/projected/29fd8c54-fe62-4644-9cce-ec5bda577457-kube-api-access-5rsk5\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.500835 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.500857 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f78cf2-9cd7-4574-9f50-2053cab39a5d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.500876 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fd8c54-fe62-4644-9cce-ec5bda577457-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.500924 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.500947 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.500970 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fd8c54-fe62-4644-9cce-ec5bda577457-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.805662 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw"] Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807089 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="extract-content" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807119 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="extract-content" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807154 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f78cf2-9cd7-4574-9f50-2053cab39a5d" containerName="route-controller-manager" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807174 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f78cf2-9cd7-4574-9f50-2053cab39a5d" containerName="route-controller-manager" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807211 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807229 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807257 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerName="extract-utilities" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807276 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerName="extract-utilities" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807298 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerName="extract-content" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807314 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerName="extract-content" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807336 4786 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="29fd8c54-fe62-4644-9cce-ec5bda577457" containerName="controller-manager" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807349 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fd8c54-fe62-4644-9cce-ec5bda577457" containerName="controller-manager" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807366 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807378 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807397 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="extract-utilities" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807409 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="extract-utilities" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807423 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" containerName="extract-content" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807435 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" containerName="extract-content" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807451 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807462 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: E0313 11:51:42.807478 4786 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" containerName="extract-utilities" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807489 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" containerName="extract-utilities" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.807655 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f78cf2-9cd7-4574-9f50-2053cab39a5d" containerName="route-controller-manager" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.808136 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="29fd8c54-fe62-4644-9cce-ec5bda577457" containerName="controller-manager" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.808160 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b7548d-0202-4690-b267-90076b5e4687" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.808180 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca57952-a8b4-45bc-bf5a-1ddd025835c9" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.808196 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="be9e61e1-45b9-42e3-899f-495a710537fc" containerName="registry-server" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.808776 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.811974 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.814558 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747"] Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.815394 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.816513 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.816786 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.816960 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.817229 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.817355 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.821716 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw"] Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.821840 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.822376 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.822692 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.822735 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.822850 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.823132 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.826606 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.831592 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747"] Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.905341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-proxy-ca-bundles\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.905603 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed858c-3eb2-46d7-8d18-48aabb59bd33-serving-cert\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.905775 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-config\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.905902 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-client-ca\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.906022 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-config\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.906133 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z8v8\" (UniqueName: \"kubernetes.io/projected/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-kube-api-access-6z8v8\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " 
pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.906227 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6n9d\" (UniqueName: \"kubernetes.io/projected/57ed858c-3eb2-46d7-8d18-48aabb59bd33-kube-api-access-l6n9d\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.906334 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-client-ca\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:42 crc kubenswrapper[4786]: I0313 11:51:42.906418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-serving-cert\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.007372 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-serving-cert\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.007471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-proxy-ca-bundles\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.007507 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed858c-3eb2-46d7-8d18-48aabb59bd33-serving-cert\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.007545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-config\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.007580 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-client-ca\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.007613 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-config\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: 
I0313 11:51:43.007634 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z8v8\" (UniqueName: \"kubernetes.io/projected/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-kube-api-access-6z8v8\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.007665 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6n9d\" (UniqueName: \"kubernetes.io/projected/57ed858c-3eb2-46d7-8d18-48aabb59bd33-kube-api-access-l6n9d\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.007694 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-client-ca\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.008574 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-client-ca\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.010566 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-client-ca\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: 
\"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.010704 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-config\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.012425 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-config\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.014274 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-proxy-ca-bundles\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.017773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-serving-cert\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.020641 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/57ed858c-3eb2-46d7-8d18-48aabb59bd33-serving-cert\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.038031 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6n9d\" (UniqueName: \"kubernetes.io/projected/57ed858c-3eb2-46d7-8d18-48aabb59bd33-kube-api-access-l6n9d\") pod \"controller-manager-74cbc87f6f-b9qdw\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.038610 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z8v8\" (UniqueName: \"kubernetes.io/projected/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-kube-api-access-6z8v8\") pod \"route-controller-manager-7d7d78db84-ln747\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.139296 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.156126 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.324037 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw"] Mar 13 11:51:43 crc kubenswrapper[4786]: W0313 11:51:43.331862 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57ed858c_3eb2_46d7_8d18_48aabb59bd33.slice/crio-c73148846cde07d489b5ebce357a900ed96fd3b862795f90b0ac8c3fd2d7f0a4 WatchSource:0}: Error finding container c73148846cde07d489b5ebce357a900ed96fd3b862795f90b0ac8c3fd2d7f0a4: Status 404 returned error can't find the container with id c73148846cde07d489b5ebce357a900ed96fd3b862795f90b0ac8c3fd2d7f0a4 Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.426759 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" event={"ID":"57ed858c-3eb2-46d7-8d18-48aabb59bd33","Type":"ContainerStarted","Data":"c73148846cde07d489b5ebce357a900ed96fd3b862795f90b0ac8c3fd2d7f0a4"} Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.461190 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29fd8c54-fe62-4644-9cce-ec5bda577457" path="/var/lib/kubelet/pods/29fd8c54-fe62-4644-9cce-ec5bda577457/volumes" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.462641 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f78cf2-9cd7-4574-9f50-2053cab39a5d" path="/var/lib/kubelet/pods/d3f78cf2-9cd7-4574-9f50-2053cab39a5d/volumes" Mar 13 11:51:43 crc kubenswrapper[4786]: I0313 11:51:43.588627 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747"] Mar 13 11:51:44 crc kubenswrapper[4786]: I0313 11:51:44.438360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" event={"ID":"57ed858c-3eb2-46d7-8d18-48aabb59bd33","Type":"ContainerStarted","Data":"e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f"} Mar 13 11:51:44 crc kubenswrapper[4786]: I0313 11:51:44.438755 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:44 crc kubenswrapper[4786]: I0313 11:51:44.443261 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:51:44 crc kubenswrapper[4786]: I0313 11:51:44.446601 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" event={"ID":"23a58d1a-70c6-4d60-a2f5-ef3762811a0b","Type":"ContainerStarted","Data":"2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45"} Mar 13 11:51:44 crc kubenswrapper[4786]: I0313 11:51:44.446633 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" event={"ID":"23a58d1a-70c6-4d60-a2f5-ef3762811a0b","Type":"ContainerStarted","Data":"18c377a0178121a209ae35b904b90bf0798cd90359cfd7fc18b00f3e0d86efa0"} Mar 13 11:51:44 crc kubenswrapper[4786]: I0313 11:51:44.446987 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:44 crc kubenswrapper[4786]: I0313 11:51:44.456637 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:51:44 crc kubenswrapper[4786]: I0313 11:51:44.461631 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" 
podStartSLOduration=3.4616093709999998 podStartE2EDuration="3.461609371s" podCreationTimestamp="2026-03-13 11:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:44.456926794 +0000 UTC m=+291.736580281" watchObservedRunningTime="2026-03-13 11:51:44.461609371 +0000 UTC m=+291.741262818" Mar 13 11:51:45 crc kubenswrapper[4786]: I0313 11:51:45.726592 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" podStartSLOduration=4.726573575 podStartE2EDuration="4.726573575s" podCreationTimestamp="2026-03-13 11:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:44.508720346 +0000 UTC m=+291.788373793" watchObservedRunningTime="2026-03-13 11:51:45.726573575 +0000 UTC m=+293.006227022" Mar 13 11:51:45 crc kubenswrapper[4786]: I0313 11:51:45.729480 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7slx"] Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.133567 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556712-wsvjm"] Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.135251 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.138801 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.139185 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.141007 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.145158 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-wsvjm"] Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.301576 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4sgb\" (UniqueName: \"kubernetes.io/projected/0fccd37e-4dd9-43b1-9553-507adcc48841-kube-api-access-s4sgb\") pod \"auto-csr-approver-29556712-wsvjm\" (UID: \"0fccd37e-4dd9-43b1-9553-507adcc48841\") " pod="openshift-infra/auto-csr-approver-29556712-wsvjm" Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.403018 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4sgb\" (UniqueName: \"kubernetes.io/projected/0fccd37e-4dd9-43b1-9553-507adcc48841-kube-api-access-s4sgb\") pod \"auto-csr-approver-29556712-wsvjm\" (UID: \"0fccd37e-4dd9-43b1-9553-507adcc48841\") " pod="openshift-infra/auto-csr-approver-29556712-wsvjm" Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.424430 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4sgb\" (UniqueName: \"kubernetes.io/projected/0fccd37e-4dd9-43b1-9553-507adcc48841-kube-api-access-s4sgb\") pod \"auto-csr-approver-29556712-wsvjm\" (UID: \"0fccd37e-4dd9-43b1-9553-507adcc48841\") " 
pod="openshift-infra/auto-csr-approver-29556712-wsvjm" Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.452660 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" Mar 13 11:52:00 crc kubenswrapper[4786]: I0313 11:52:00.905467 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-wsvjm"] Mar 13 11:52:01 crc kubenswrapper[4786]: I0313 11:52:01.553296 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" event={"ID":"0fccd37e-4dd9-43b1-9553-507adcc48841","Type":"ContainerStarted","Data":"977e43f46607cbb0de84febb69f36ddb89e51d452e2a4afaada4b098f67ed872"} Mar 13 11:52:01 crc kubenswrapper[4786]: I0313 11:52:01.704077 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw"] Mar 13 11:52:01 crc kubenswrapper[4786]: I0313 11:52:01.704327 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" podUID="57ed858c-3eb2-46d7-8d18-48aabb59bd33" containerName="controller-manager" containerID="cri-o://e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f" gracePeriod=30 Mar 13 11:52:01 crc kubenswrapper[4786]: I0313 11:52:01.799781 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747"] Mar 13 11:52:01 crc kubenswrapper[4786]: I0313 11:52:01.799994 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" podUID="23a58d1a-70c6-4d60-a2f5-ef3762811a0b" containerName="route-controller-manager" containerID="cri-o://2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45" gracePeriod=30 Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 
11:52:02.311400 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.326669 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444319 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-proxy-ca-bundles\") pod \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444364 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z8v8\" (UniqueName: \"kubernetes.io/projected/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-kube-api-access-6z8v8\") pod \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444432 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-config\") pod \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444459 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed858c-3eb2-46d7-8d18-48aabb59bd33-serving-cert\") pod \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444481 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-client-ca\") pod \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444522 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-client-ca\") pod \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444555 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-serving-cert\") pod \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444601 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-config\") pod \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\" (UID: \"23a58d1a-70c6-4d60-a2f5-ef3762811a0b\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.444629 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6n9d\" (UniqueName: \"kubernetes.io/projected/57ed858c-3eb2-46d7-8d18-48aabb59bd33-kube-api-access-l6n9d\") pod \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\" (UID: \"57ed858c-3eb2-46d7-8d18-48aabb59bd33\") " Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.445201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-client-ca" (OuterVolumeSpecName: "client-ca") pod "23a58d1a-70c6-4d60-a2f5-ef3762811a0b" (UID: "23a58d1a-70c6-4d60-a2f5-ef3762811a0b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.445503 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-config" (OuterVolumeSpecName: "config") pod "57ed858c-3eb2-46d7-8d18-48aabb59bd33" (UID: "57ed858c-3eb2-46d7-8d18-48aabb59bd33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.445524 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "57ed858c-3eb2-46d7-8d18-48aabb59bd33" (UID: "57ed858c-3eb2-46d7-8d18-48aabb59bd33"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.446068 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-config" (OuterVolumeSpecName: "config") pod "23a58d1a-70c6-4d60-a2f5-ef3762811a0b" (UID: "23a58d1a-70c6-4d60-a2f5-ef3762811a0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.446195 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-client-ca" (OuterVolumeSpecName: "client-ca") pod "57ed858c-3eb2-46d7-8d18-48aabb59bd33" (UID: "57ed858c-3eb2-46d7-8d18-48aabb59bd33"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.450409 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ed858c-3eb2-46d7-8d18-48aabb59bd33-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "57ed858c-3eb2-46d7-8d18-48aabb59bd33" (UID: "57ed858c-3eb2-46d7-8d18-48aabb59bd33"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.450447 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "23a58d1a-70c6-4d60-a2f5-ef3762811a0b" (UID: "23a58d1a-70c6-4d60-a2f5-ef3762811a0b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.450478 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-kube-api-access-6z8v8" (OuterVolumeSpecName: "kube-api-access-6z8v8") pod "23a58d1a-70c6-4d60-a2f5-ef3762811a0b" (UID: "23a58d1a-70c6-4d60-a2f5-ef3762811a0b"). InnerVolumeSpecName "kube-api-access-6z8v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.451157 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ed858c-3eb2-46d7-8d18-48aabb59bd33-kube-api-access-l6n9d" (OuterVolumeSpecName: "kube-api-access-l6n9d") pod "57ed858c-3eb2-46d7-8d18-48aabb59bd33" (UID: "57ed858c-3eb2-46d7-8d18-48aabb59bd33"). InnerVolumeSpecName "kube-api-access-l6n9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547186 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547235 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z8v8\" (UniqueName: \"kubernetes.io/projected/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-kube-api-access-6z8v8\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547257 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547276 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed858c-3eb2-46d7-8d18-48aabb59bd33-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547294 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547312 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57ed858c-3eb2-46d7-8d18-48aabb59bd33-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547331 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547349 4786 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a58d1a-70c6-4d60-a2f5-ef3762811a0b-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.547367 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6n9d\" (UniqueName: \"kubernetes.io/projected/57ed858c-3eb2-46d7-8d18-48aabb59bd33-kube-api-access-l6n9d\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.559805 4786 generic.go:334] "Generic (PLEG): container finished" podID="57ed858c-3eb2-46d7-8d18-48aabb59bd33" containerID="e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f" exitCode=0 Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.559859 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" event={"ID":"57ed858c-3eb2-46d7-8d18-48aabb59bd33","Type":"ContainerDied","Data":"e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f"} Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.559900 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" event={"ID":"57ed858c-3eb2-46d7-8d18-48aabb59bd33","Type":"ContainerDied","Data":"c73148846cde07d489b5ebce357a900ed96fd3b862795f90b0ac8c3fd2d7f0a4"} Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.559920 4786 scope.go:117] "RemoveContainer" containerID="e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.560028 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.566555 4786 generic.go:334] "Generic (PLEG): container finished" podID="23a58d1a-70c6-4d60-a2f5-ef3762811a0b" containerID="2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45" exitCode=0 Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.566650 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" event={"ID":"23a58d1a-70c6-4d60-a2f5-ef3762811a0b","Type":"ContainerDied","Data":"2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45"} Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.566689 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" event={"ID":"23a58d1a-70c6-4d60-a2f5-ef3762811a0b","Type":"ContainerDied","Data":"18c377a0178121a209ae35b904b90bf0798cd90359cfd7fc18b00f3e0d86efa0"} Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.566788 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.569140 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" event={"ID":"0fccd37e-4dd9-43b1-9553-507adcc48841","Type":"ContainerStarted","Data":"d624051a2ace35fa59b7c1dc6cf92f8ab0bd4c05854cfa4d29a57a0b17f95820"} Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.581517 4786 scope.go:117] "RemoveContainer" containerID="e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.594260 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" podStartSLOduration=1.311385772 podStartE2EDuration="2.59423815s" podCreationTimestamp="2026-03-13 11:52:00 +0000 UTC" firstStartedPulling="2026-03-13 11:52:00.924610406 +0000 UTC m=+308.204263853" lastFinishedPulling="2026-03-13 11:52:02.207462784 +0000 UTC m=+309.487116231" observedRunningTime="2026-03-13 11:52:02.593317564 +0000 UTC m=+309.872971021" watchObservedRunningTime="2026-03-13 11:52:02.59423815 +0000 UTC m=+309.873891607" Mar 13 11:52:02 crc kubenswrapper[4786]: E0313 11:52:02.596855 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f\": container with ID starting with e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f not found: ID does not exist" containerID="e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.596930 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f"} err="failed to get container status 
\"e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f\": rpc error: code = NotFound desc = could not find container \"e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f\": container with ID starting with e33e54cd73b2fd99481923bfb54f45ab5ccd978d0d982ba706f06ddd3996894f not found: ID does not exist" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.596967 4786 scope.go:117] "RemoveContainer" containerID="2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.616710 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw"] Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.621694 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74cbc87f6f-b9qdw"] Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.624452 4786 scope.go:117] "RemoveContainer" containerID="2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.624986 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747"] Mar 13 11:52:02 crc kubenswrapper[4786]: E0313 11:52:02.625365 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45\": container with ID starting with 2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45 not found: ID does not exist" containerID="2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.625448 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45"} err="failed to get container status 
\"2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45\": rpc error: code = NotFound desc = could not find container \"2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45\": container with ID starting with 2847b1d0515e48915c7afc9e133b9ca6b89fa1e39ed5c9602b49f7105ec4be45 not found: ID does not exist" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.628476 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d7d78db84-ln747"] Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.817803 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd"] Mar 13 11:52:02 crc kubenswrapper[4786]: E0313 11:52:02.818181 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a58d1a-70c6-4d60-a2f5-ef3762811a0b" containerName="route-controller-manager" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.818230 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a58d1a-70c6-4d60-a2f5-ef3762811a0b" containerName="route-controller-manager" Mar 13 11:52:02 crc kubenswrapper[4786]: E0313 11:52:02.818260 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ed858c-3eb2-46d7-8d18-48aabb59bd33" containerName="controller-manager" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.818280 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ed858c-3eb2-46d7-8d18-48aabb59bd33" containerName="controller-manager" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.818521 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ed858c-3eb2-46d7-8d18-48aabb59bd33" containerName="controller-manager" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.818554 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a58d1a-70c6-4d60-a2f5-ef3762811a0b" containerName="route-controller-manager" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 
11:52:02.819278 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.823435 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.823800 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.824258 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.825095 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.825391 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.833862 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd"] Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.838398 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.839924 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.952531 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9gvf\" (UniqueName: \"kubernetes.io/projected/62320ceb-b6d8-4853-a1e5-c190ed104113-kube-api-access-p9gvf\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: 
\"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.952613 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-client-ca\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.952663 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-config\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.952990 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-proxy-ca-bundles\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:02 crc kubenswrapper[4786]: I0313 11:52:02.953057 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62320ceb-b6d8-4853-a1e5-c190ed104113-serving-cert\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.055005 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-proxy-ca-bundles\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.055079 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62320ceb-b6d8-4853-a1e5-c190ed104113-serving-cert\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.055138 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9gvf\" (UniqueName: \"kubernetes.io/projected/62320ceb-b6d8-4853-a1e5-c190ed104113-kube-api-access-p9gvf\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.055187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-client-ca\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.055251 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-config\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 
11:52:03.056831 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-client-ca\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.057111 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-proxy-ca-bundles\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.057870 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62320ceb-b6d8-4853-a1e5-c190ed104113-config\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.061816 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62320ceb-b6d8-4853-a1e5-c190ed104113-serving-cert\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.086147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9gvf\" (UniqueName: \"kubernetes.io/projected/62320ceb-b6d8-4853-a1e5-c190ed104113-kube-api-access-p9gvf\") pod \"controller-manager-75d4fb79c5-hn7sd\" (UID: \"62320ceb-b6d8-4853-a1e5-c190ed104113\") " 
pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.165356 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.407996 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd"] Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.450460 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a58d1a-70c6-4d60-a2f5-ef3762811a0b" path="/var/lib/kubelet/pods/23a58d1a-70c6-4d60-a2f5-ef3762811a0b/volumes" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.451944 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ed858c-3eb2-46d7-8d18-48aabb59bd33" path="/var/lib/kubelet/pods/57ed858c-3eb2-46d7-8d18-48aabb59bd33/volumes" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.578749 4786 generic.go:334] "Generic (PLEG): container finished" podID="0fccd37e-4dd9-43b1-9553-507adcc48841" containerID="d624051a2ace35fa59b7c1dc6cf92f8ab0bd4c05854cfa4d29a57a0b17f95820" exitCode=0 Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.578842 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" event={"ID":"0fccd37e-4dd9-43b1-9553-507adcc48841","Type":"ContainerDied","Data":"d624051a2ace35fa59b7c1dc6cf92f8ab0bd4c05854cfa4d29a57a0b17f95820"} Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.583922 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" event={"ID":"62320ceb-b6d8-4853-a1e5-c190ed104113","Type":"ContainerStarted","Data":"859351d1780c25bfc21ef65ed3784bd01824363023271c58a31784fc553f5875"} Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.583959 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" event={"ID":"62320ceb-b6d8-4853-a1e5-c190ed104113","Type":"ContainerStarted","Data":"43205173fbc3da7157c939c63f0ffc8939fbc4299701b8422ab83aa718268ac9"} Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.584717 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.587662 4786 patch_prober.go:28] interesting pod/controller-manager-75d4fb79c5-hn7sd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.587714 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.621344 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" podStartSLOduration=2.621314847 podStartE2EDuration="2.621314847s" podCreationTimestamp="2026-03-13 11:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:03.618356557 +0000 UTC m=+310.898010034" watchObservedRunningTime="2026-03-13 11:52:03.621314847 +0000 UTC m=+310.900968324" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.815293 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t"] Mar 13 
11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.815927 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.820088 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.820161 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.821152 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.821238 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.821385 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.821537 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.825850 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t"] Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.855859 4786 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.866021 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.866433 4786 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.866808 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b" gracePeriod=15 Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.867069 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa" gracePeriod=15 Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.867133 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586" gracePeriod=15 Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.867207 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302" gracePeriod=15 Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.867312 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6" gracePeriod=15 Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.870213 4786 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.871572 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.871590 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.871753 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.871763 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.871771 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.871777 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.871785 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.871794 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:03 crc 
kubenswrapper[4786]: E0313 11:52:03.871804 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.871810 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.871826 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.871832 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.871845 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.871851 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.871864 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.871870 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872058 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872074 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872087 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872093 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872105 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872115 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872127 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872133 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872144 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.872283 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872291 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 13 11:52:03 crc kubenswrapper[4786]: E0313 11:52:03.872564 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.872574 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.968854 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-client-ca\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969148 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969177 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-serving-cert\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969241 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzq6\" (UniqueName: \"kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969316 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969358 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969391 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969428 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969453 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:03 crc kubenswrapper[4786]: I0313 11:52:03.969497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-config\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070434 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070496 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070544 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-config\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070567 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070543 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070618 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-client-ca\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc 
kubenswrapper[4786]: I0313 11:52:04.070639 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070664 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070680 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070698 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-serving-cert\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070723 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzq6\" (UniqueName: \"kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070774 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070794 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.070855 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.071642 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-client-ca\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc 
kubenswrapper[4786]: I0313 11:52:04.071688 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.071710 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.071729 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.072046 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-config\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.072137 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 
11:52:04.072460 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:04 crc kubenswrapper[4786]: E0313 11:52:04.072901 4786 projected.go:194] Error preparing data for projected volume kube-api-access-qkzq6 for pod openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:04 crc kubenswrapper[4786]: E0313 11:52:04.072963 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6 podName:0aa29cab-8be4-4999-aea4-c8c84acb6f5f nodeName:}" failed. No retries permitted until 2026-03-13 11:52:04.572944477 +0000 UTC m=+311.852597944 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qkzq6" (UniqueName: "kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6") pod "route-controller-manager-6c7595bd78-q7j4t" (UID: "0aa29cab-8be4-4999-aea4-c8c84acb6f5f") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:04 crc kubenswrapper[4786]: E0313 11:52:04.073285 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-6c7595bd78-q7j4t.189c6461b413d981 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-6c7595bd78-q7j4t,UID:0aa29cab-8be4-4999-aea4-c8c84acb6f5f,APIVersion:v1,ResourceVersion:29948,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-qkzq6\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token\": dial tcp 38.102.83.151:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:52:04.072937857 +0000 UTC m=+311.352591314,LastTimestamp:2026-03-13 11:52:04.072937857 +0000 UTC m=+311.352591314,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.076565 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-serving-cert\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.577993 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzq6\" (UniqueName: \"kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:04 crc kubenswrapper[4786]: E0313 11:52:04.578597 4786 projected.go:194] Error preparing data for projected volume kube-api-access-qkzq6 for pod openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:04 crc kubenswrapper[4786]: E0313 11:52:04.578660 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6 podName:0aa29cab-8be4-4999-aea4-c8c84acb6f5f nodeName:}" failed. No retries permitted until 2026-03-13 11:52:05.578640708 +0000 UTC m=+312.858294155 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qkzq6" (UniqueName: "kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6") pod "route-controller-manager-6c7595bd78-q7j4t" (UID: "0aa29cab-8be4-4999-aea4-c8c84acb6f5f") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.590029 4786 generic.go:334] "Generic (PLEG): container finished" podID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" containerID="2b9922be607248f1c765ddd2032e8fe62e1d0161372782345f9f6f14eee36643" exitCode=0 Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.590099 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62","Type":"ContainerDied","Data":"2b9922be607248f1c765ddd2032e8fe62e1d0161372782345f9f6f14eee36643"} Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.590654 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.590842 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.592689 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.593548 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.594019 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa" exitCode=0 Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.594038 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586" exitCode=0 Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.594047 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6" exitCode=0 Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.594057 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302" exitCode=2 Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.594823 4786 scope.go:117] "RemoveContainer" containerID="ae58e17e1e0a742d6aa0465fd6632b1fadcd3e9bcecb5aa58ef959e78f543a69" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.599977 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.600287 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.600435 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.600576 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.894788 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.895536 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.895846 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.896331 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.896703 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.982991 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4sgb\" (UniqueName: \"kubernetes.io/projected/0fccd37e-4dd9-43b1-9553-507adcc48841-kube-api-access-s4sgb\") pod \"0fccd37e-4dd9-43b1-9553-507adcc48841\" 
(UID: \"0fccd37e-4dd9-43b1-9553-507adcc48841\") " Mar 13 11:52:04 crc kubenswrapper[4786]: I0313 11:52:04.988140 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fccd37e-4dd9-43b1-9553-507adcc48841-kube-api-access-s4sgb" (OuterVolumeSpecName: "kube-api-access-s4sgb") pod "0fccd37e-4dd9-43b1-9553-507adcc48841" (UID: "0fccd37e-4dd9-43b1-9553-507adcc48841"). InnerVolumeSpecName "kube-api-access-s4sgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.084308 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4sgb\" (UniqueName: \"kubernetes.io/projected/0fccd37e-4dd9-43b1-9553-507adcc48841-kube-api-access-s4sgb\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:05 crc kubenswrapper[4786]: E0313 11:52:05.514105 4786 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" volumeName="registry-storage" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.594021 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzq6\" (UniqueName: \"kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:05 crc kubenswrapper[4786]: E0313 11:52:05.594699 4786 projected.go:194] Error preparing data for projected volume kube-api-access-qkzq6 for pod 
openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:05 crc kubenswrapper[4786]: E0313 11:52:05.594780 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6 podName:0aa29cab-8be4-4999-aea4-c8c84acb6f5f nodeName:}" failed. No retries permitted until 2026-03-13 11:52:07.59475697 +0000 UTC m=+314.874410427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qkzq6" (UniqueName: "kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6") pod "route-controller-manager-6c7595bd78-q7j4t" (UID: "0aa29cab-8be4-4999-aea4-c8c84acb6f5f") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.603699 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" event={"ID":"0fccd37e-4dd9-43b1-9553-507adcc48841","Type":"ContainerDied","Data":"977e43f46607cbb0de84febb69f36ddb89e51d452e2a4afaada4b098f67ed872"} Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.603745 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.603758 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="977e43f46607cbb0de84febb69f36ddb89e51d452e2a4afaada4b098f67ed872" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.608123 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.610498 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.610737 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.611146 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.947319 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.948106 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.948308 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:05 crc kubenswrapper[4786]: I0313 11:52:05.948467 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.100316 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kube-api-access\") pod \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.100396 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kubelet-dir\") pod \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " Mar 13 
11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.100442 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-var-lock\") pod \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\" (UID: \"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62\") " Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.100546 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" (UID: "5dd1b43f-299a-49c8-a8c3-d684c0ee2c62"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.100615 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-var-lock" (OuterVolumeSpecName: "var-lock") pod "5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" (UID: "5dd1b43f-299a-49c8-a8c3-d684c0ee2c62"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.100847 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.100866 4786 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.104025 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" (UID: "5dd1b43f-299a-49c8-a8c3-d684c0ee2c62"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.202590 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dd1b43f-299a-49c8-a8c3-d684c0ee2c62-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:06 crc kubenswrapper[4786]: E0313 11:52:06.254167 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: E0313 11:52:06.254927 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: E0313 11:52:06.255402 4786 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: E0313 11:52:06.255988 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: E0313 11:52:06.256540 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.256599 4786 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 11:52:06 crc kubenswrapper[4786]: E0313 11:52:06.257216 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Mar 13 11:52:06 crc kubenswrapper[4786]: E0313 11:52:06.458506 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.619692 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5dd1b43f-299a-49c8-a8c3-d684c0ee2c62","Type":"ContainerDied","Data":"d332a85de0f5274a1eb6c3aafe0f79357d5ab8783df86a66fa4866033e4b4aa5"} Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.619748 
4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d332a85de0f5274a1eb6c3aafe0f79357d5ab8783df86a66fa4866033e4b4aa5" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.619858 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.627316 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.628760 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b" exitCode=0 Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.628849 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d63fba13b335d463adaccbbc61b8d56609349f139fd53767f0551d628583fbf" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.639078 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.640530 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.641415 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.641832 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.642213 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.642955 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.651768 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.652090 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.652411 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.652698 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.815204 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.815976 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.816009 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.816167 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.816455 4786 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.816468 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.816544 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:06 crc kubenswrapper[4786]: E0313 11:52:06.859622 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.917636 4786 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:06 crc kubenswrapper[4786]: I0313 11:52:06.917670 4786 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.446904 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.628180 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzq6\" (UniqueName: \"kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:07 crc kubenswrapper[4786]: E0313 11:52:07.629770 4786 projected.go:194] Error preparing data for projected volume kube-api-access-qkzq6 for pod openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": 
dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:07 crc kubenswrapper[4786]: E0313 11:52:07.629901 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6 podName:0aa29cab-8be4-4999-aea4-c8c84acb6f5f nodeName:}" failed. No retries permitted until 2026-03-13 11:52:11.629855401 +0000 UTC m=+318.909508918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qkzq6" (UniqueName: "kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6") pod "route-controller-manager-6c7595bd78-q7j4t" (UID: "0aa29cab-8be4-4999-aea4-c8c84acb6f5f") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.633570 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.634530 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.635227 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.635837 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.636446 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.637432 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.637967 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.638452 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:07 crc kubenswrapper[4786]: I0313 11:52:07.638826 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:07 crc kubenswrapper[4786]: E0313 11:52:07.661054 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Mar 13 11:52:08 crc kubenswrapper[4786]: E0313 11:52:08.901792 4786 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.151:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:08 crc kubenswrapper[4786]: I0313 11:52:08.902498 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:09 crc kubenswrapper[4786]: E0313 11:52:09.261873 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Mar 13 11:52:09 crc kubenswrapper[4786]: I0313 11:52:09.657130 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773"} Mar 13 11:52:09 crc kubenswrapper[4786]: I0313 11:52:09.657709 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e836393a78ece6f84ccfdbadc46aba347ba2e9c657556062c654067067fe1dc4"} Mar 13 11:52:09 crc kubenswrapper[4786]: I0313 11:52:09.658573 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:09 crc kubenswrapper[4786]: E0313 11:52:09.658945 4786 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.151:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:09 crc kubenswrapper[4786]: I0313 11:52:09.659280 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:09 crc kubenswrapper[4786]: I0313 11:52:09.659658 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:10 crc kubenswrapper[4786]: I0313 11:52:10.779539 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" podUID="e090577d-dd68-4f18-b70a-836560c655ce" containerName="oauth-openshift" containerID="cri-o://ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e" gracePeriod=15 Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.328744 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.329640 4786 status_manager.go:851] "Failed to get status for pod" podUID="e090577d-dd68-4f18-b70a-836560c655ce" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-p7slx\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.330156 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.330611 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.331044 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.484750 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-router-certs\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.484812 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-audit-policies\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.484865 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4967\" (UniqueName: \"kubernetes.io/projected/e090577d-dd68-4f18-b70a-836560c655ce-kube-api-access-n4967\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.484949 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-session\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485004 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-serving-cert\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485105 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-idp-0-file-data\") pod 
\"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485151 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-error\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485274 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-cliconfig\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485330 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-service-ca\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485385 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-ocp-branding-template\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485427 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-provider-selection\") pod 
\"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485509 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-trusted-ca-bundle\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485558 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e090577d-dd68-4f18-b70a-836560c655ce-audit-dir\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.485616 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-login\") pod \"e090577d-dd68-4f18-b70a-836560c655ce\" (UID: \"e090577d-dd68-4f18-b70a-836560c655ce\") " Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.486512 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.486639 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.487611 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.487654 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e090577d-dd68-4f18-b70a-836560c655ce-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.489011 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.492502 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.494282 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e090577d-dd68-4f18-b70a-836560c655ce-kube-api-access-n4967" (OuterVolumeSpecName: "kube-api-access-n4967") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "kube-api-access-n4967". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.496379 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.496741 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.497092 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.497513 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.497825 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.497871 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.498206 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e090577d-dd68-4f18-b70a-836560c655ce" (UID: "e090577d-dd68-4f18-b70a-836560c655ce"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.587710 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.587756 4786 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e090577d-dd68-4f18-b70a-836560c655ce-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.587772 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.587783 4786 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.587903 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-router-certs\") on node \"crc\" DevicePath 
\"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588020 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4967\" (UniqueName: \"kubernetes.io/projected/e090577d-dd68-4f18-b70a-836560c655ce-kube-api-access-n4967\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588058 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588074 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588090 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588104 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588118 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588132 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588146 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.588161 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e090577d-dd68-4f18-b70a-836560c655ce-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.674179 4786 generic.go:334] "Generic (PLEG): container finished" podID="e090577d-dd68-4f18-b70a-836560c655ce" containerID="ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e" exitCode=0 Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.674248 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" event={"ID":"e090577d-dd68-4f18-b70a-836560c655ce","Type":"ContainerDied","Data":"ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e"} Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.674278 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.674307 4786 scope.go:117] "RemoveContainer" containerID="ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.674295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" event={"ID":"e090577d-dd68-4f18-b70a-836560c655ce","Type":"ContainerDied","Data":"19f45b55749131f484204d730da515a461ef11f9652dc68d43ba30df2e0d7a11"} Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.675155 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.675585 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.675920 4786 status_manager.go:851] "Failed to get status for pod" podUID="e090577d-dd68-4f18-b70a-836560c655ce" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-p7slx\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.676212 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" 
pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.689081 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzq6\" (UniqueName: \"kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:11 crc kubenswrapper[4786]: E0313 11:52:11.689874 4786 projected.go:194] Error preparing data for projected volume kube-api-access-qkzq6 for pod openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:11 crc kubenswrapper[4786]: E0313 11:52:11.690010 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6 podName:0aa29cab-8be4-4999-aea4-c8c84acb6f5f nodeName:}" failed. No retries permitted until 2026-03-13 11:52:19.68998042 +0000 UTC m=+326.969633897 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qkzq6" (UniqueName: "kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6") pod "route-controller-manager-6c7595bd78-q7j4t" (UID: "0aa29cab-8be4-4999-aea4-c8c84acb6f5f") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.151:6443: connect: connection refused Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.703582 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.704598 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.705524 4786 status_manager.go:851] "Failed to get status for pod" podUID="e090577d-dd68-4f18-b70a-836560c655ce" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-p7slx\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.706747 4786 scope.go:117] "RemoveContainer" containerID="ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.706955 4786 status_manager.go:851] "Failed to 
get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:11 crc kubenswrapper[4786]: E0313 11:52:11.707323 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e\": container with ID starting with ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e not found: ID does not exist" containerID="ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e" Mar 13 11:52:11 crc kubenswrapper[4786]: I0313 11:52:11.707376 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e"} err="failed to get container status \"ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e\": rpc error: code = NotFound desc = could not find container \"ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e\": container with ID starting with ba2934ae4f71442d1e66bba9c3db62eb6f7018204da017006c1260e320f0254e not found: ID does not exist" Mar 13 11:52:12 crc kubenswrapper[4786]: E0313 11:52:12.463282 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="6.4s" Mar 13 11:52:13 crc kubenswrapper[4786]: E0313 11:52:13.449753 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.151:6443: connect: connection refused" 
event="&Event{ObjectMeta:{route-controller-manager-6c7595bd78-q7j4t.189c6461b413d981 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-6c7595bd78-q7j4t,UID:0aa29cab-8be4-4999-aea4-c8c84acb6f5f,APIVersion:v1,ResourceVersion:29948,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-qkzq6\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token\": dial tcp 38.102.83.151:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:52:04.072937857 +0000 UTC m=+311.352591314,LastTimestamp:2026-03-13 11:52:04.072937857 +0000 UTC m=+311.352591314,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:52:13 crc kubenswrapper[4786]: I0313 11:52:13.451025 4786 status_manager.go:851] "Failed to get status for pod" podUID="e090577d-dd68-4f18-b70a-836560c655ce" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-p7slx\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:13 crc kubenswrapper[4786]: I0313 11:52:13.451573 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:13 crc kubenswrapper[4786]: I0313 11:52:13.451806 4786 status_manager.go:851] "Failed to get status for pod" 
podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:13 crc kubenswrapper[4786]: I0313 11:52:13.452069 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.439999 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.441073 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.441511 4786 status_manager.go:851] "Failed to get status for pod" podUID="e090577d-dd68-4f18-b70a-836560c655ce" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-p7slx\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.441915 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.442457 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.461851 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.461915 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:16 crc kubenswrapper[4786]: E0313 11:52:16.462292 4786 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.462837 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:16 crc kubenswrapper[4786]: I0313 11:52:16.712186 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a13de238d1a52263b02cf3974ce51f0298b85a5055b966dedc45882d9efc5f6b"} Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.718954 4786 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="39bfd506117bfb163772918e5def47314f1705b1803eb581b247755a68100c48" exitCode=0 Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.719072 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"39bfd506117bfb163772918e5def47314f1705b1803eb581b247755a68100c48"} Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.719414 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.719456 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.720097 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:17 crc kubenswrapper[4786]: E0313 11:52:17.720207 4786 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial 
tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.720775 4786 status_manager.go:851] "Failed to get status for pod" podUID="e090577d-dd68-4f18-b70a-836560c655ce" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-p7slx\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.721284 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.721911 4786 status_manager.go:851] "Failed to get status for pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.725975 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.727235 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.727311 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c" exitCode=1 Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.727421 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c"} Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.728225 4786 scope.go:117] "RemoveContainer" containerID="28c1d98a0c7ba24ef602ef00ab698caa3da4c5f365fbf3ea24e273709c68299c" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.728351 4786 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.728936 4786 status_manager.go:851] "Failed to get status for pod" podUID="e090577d-dd68-4f18-b70a-836560c655ce" pod="openshift-authentication/oauth-openshift-558db77b4-p7slx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-p7slx\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.729449 4786 status_manager.go:851] "Failed to get status for pod" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" pod="openshift-infra/auto-csr-approver-29556712-wsvjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29556712-wsvjm\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.729955 4786 status_manager.go:851] "Failed to get status for 
pod" podUID="62320ceb-b6d8-4853-a1e5-c190ed104113" pod="openshift-controller-manager/controller-manager-75d4fb79c5-hn7sd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75d4fb79c5-hn7sd\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:17 crc kubenswrapper[4786]: I0313 11:52:17.730394 4786 status_manager.go:851] "Failed to get status for pod" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 13 11:52:18 crc kubenswrapper[4786]: I0313 11:52:18.087254 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:52:18 crc kubenswrapper[4786]: I0313 11:52:18.747690 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"14ae8dad3d48696328415b9b964d74e548e0f1084f24bb0bf36a230668020b84"} Mar 13 11:52:18 crc kubenswrapper[4786]: I0313 11:52:18.748002 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e92e983f26fe45eded0b4a058578379551dc2a9cdfa914e197b01f7ec2db2f8b"} Mar 13 11:52:18 crc kubenswrapper[4786]: I0313 11:52:18.748013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c94d57ef2081fb0cdbdfb81ede84d075a92cc04788188013d1675f6f00c3f9ea"} Mar 13 11:52:18 crc kubenswrapper[4786]: I0313 11:52:18.748021 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"09e96d8b0ae6f0321286a381c244a8d62bc7eaf9ff41e5021683c9bb6ece3259"} Mar 13 11:52:18 crc kubenswrapper[4786]: I0313 11:52:18.756525 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 11:52:18 crc kubenswrapper[4786]: I0313 11:52:18.757437 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 11:52:18 crc kubenswrapper[4786]: I0313 11:52:18.757476 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c59afdf80405fb0065fa2f61937d7a7ead7bb9ddd42cced8a386ce88611e7673"} Mar 13 11:52:19 crc kubenswrapper[4786]: I0313 11:52:19.696912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzq6\" (UniqueName: \"kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:19 crc kubenswrapper[4786]: I0313 11:52:19.768659 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"96950d4f776811829fa0588a6615cf5f88d4718e170335f60a79ad9fe42eeec2"} Mar 13 11:52:19 crc kubenswrapper[4786]: I0313 11:52:19.768845 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 
11:52:19 crc kubenswrapper[4786]: I0313 11:52:19.769095 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:19 crc kubenswrapper[4786]: I0313 11:52:19.769137 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:20 crc kubenswrapper[4786]: I0313 11:52:20.824527 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:52:20 crc kubenswrapper[4786]: I0313 11:52:20.830032 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:52:21 crc kubenswrapper[4786]: I0313 11:52:21.463020 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:21 crc kubenswrapper[4786]: I0313 11:52:21.463171 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:21 crc kubenswrapper[4786]: I0313 11:52:21.469096 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:21 crc kubenswrapper[4786]: I0313 11:52:21.780146 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:52:24 crc kubenswrapper[4786]: I0313 11:52:24.733136 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzq6\" (UniqueName: \"kubernetes.io/projected/0aa29cab-8be4-4999-aea4-c8c84acb6f5f-kube-api-access-qkzq6\") pod \"route-controller-manager-6c7595bd78-q7j4t\" (UID: \"0aa29cab-8be4-4999-aea4-c8c84acb6f5f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:24 crc kubenswrapper[4786]: I0313 11:52:24.778808 4786 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:24 crc kubenswrapper[4786]: I0313 11:52:24.796810 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:24 crc kubenswrapper[4786]: I0313 11:52:24.797080 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:24 crc kubenswrapper[4786]: I0313 11:52:24.802029 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:24 crc kubenswrapper[4786]: I0313 11:52:24.804267 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a5c945a6-c15b-457e-8a5d-28a0e2693c47" Mar 13 11:52:24 crc kubenswrapper[4786]: I0313 11:52:24.879518 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:25 crc kubenswrapper[4786]: W0313 11:52:25.301051 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa29cab_8be4_4999_aea4_c8c84acb6f5f.slice/crio-d192e36c57f43c79a41746d1355037d86b5ce9530c7245f62d66ba970bab6dd4 WatchSource:0}: Error finding container d192e36c57f43c79a41746d1355037d86b5ce9530c7245f62d66ba970bab6dd4: Status 404 returned error can't find the container with id d192e36c57f43c79a41746d1355037d86b5ce9530c7245f62d66ba970bab6dd4 Mar 13 11:52:25 crc kubenswrapper[4786]: I0313 11:52:25.803460 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" event={"ID":"0aa29cab-8be4-4999-aea4-c8c84acb6f5f","Type":"ContainerStarted","Data":"63b0e87a6608ce5a87007394f527a09216bfa68dbce7ea05415bf7b1007e7a85"} Mar 13 11:52:25 crc kubenswrapper[4786]: I0313 11:52:25.803746 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" event={"ID":"0aa29cab-8be4-4999-aea4-c8c84acb6f5f","Type":"ContainerStarted","Data":"d192e36c57f43c79a41746d1355037d86b5ce9530c7245f62d66ba970bab6dd4"} Mar 13 11:52:25 crc kubenswrapper[4786]: I0313 11:52:25.803629 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:25 crc kubenswrapper[4786]: I0313 11:52:25.803778 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7db6fd3-49ce-4311-850e-dcb4e4db3a67" Mar 13 11:52:26 crc kubenswrapper[4786]: I0313 11:52:26.809997 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:27 
crc kubenswrapper[4786]: I0313 11:52:27.810639 4786 patch_prober.go:28] interesting pod/route-controller-manager-6c7595bd78-q7j4t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:52:27 crc kubenswrapper[4786]: I0313 11:52:27.810844 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" podUID="0aa29cab-8be4-4999-aea4-c8c84acb6f5f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:52:28 crc kubenswrapper[4786]: I0313 11:52:28.093379 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:52:28 crc kubenswrapper[4786]: I0313 11:52:28.819214 4786 patch_prober.go:28] interesting pod/route-controller-manager-6c7595bd78-q7j4t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:52:28 crc kubenswrapper[4786]: I0313 11:52:28.819558 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" podUID="0aa29cab-8be4-4999-aea4-c8c84acb6f5f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:52:29 
crc kubenswrapper[4786]: I0313 11:52:29.536353 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.537385 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.539656 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.539820 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.548041 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.563751 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.638640 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.638720 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.641228 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.651590 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.665038 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.666124 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.773389 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.785203 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:52:29 crc kubenswrapper[4786]: I0313 11:52:29.798398 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:52:30 crc kubenswrapper[4786]: W0313 11:52:30.272530 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e314d914a6f72a4efda2e0216fc542fb8acd3b6c1526e34d9ea3e0e752eec9df WatchSource:0}: Error finding container e314d914a6f72a4efda2e0216fc542fb8acd3b6c1526e34d9ea3e0e752eec9df: Status 404 returned error can't find the container with id e314d914a6f72a4efda2e0216fc542fb8acd3b6c1526e34d9ea3e0e752eec9df Mar 13 11:52:30 crc kubenswrapper[4786]: W0313 11:52:30.366281 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-8447ec91d1b05ebe1a817f4bb555ed7e2322db99211e82d4ed8c24a8b8286747 WatchSource:0}: Error finding container 8447ec91d1b05ebe1a817f4bb555ed7e2322db99211e82d4ed8c24a8b8286747: Status 404 returned error can't find the container with id 8447ec91d1b05ebe1a817f4bb555ed7e2322db99211e82d4ed8c24a8b8286747 Mar 13 11:52:30 crc 
kubenswrapper[4786]: W0313 11:52:30.473031 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ab13e24b92bc6f6e0d6075213c25ae58aada92e13f50a6088aade2b6c29519e5 WatchSource:0}: Error finding container ab13e24b92bc6f6e0d6075213c25ae58aada92e13f50a6088aade2b6c29519e5: Status 404 returned error can't find the container with id ab13e24b92bc6f6e0d6075213c25ae58aada92e13f50a6088aade2b6c29519e5 Mar 13 11:52:30 crc kubenswrapper[4786]: I0313 11:52:30.868589 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7b81ec5fae08d349bc709fb7495285407d042655ec2254b50a4b65722479c243"} Mar 13 11:52:30 crc kubenswrapper[4786]: I0313 11:52:30.869403 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ab13e24b92bc6f6e0d6075213c25ae58aada92e13f50a6088aade2b6c29519e5"} Mar 13 11:52:30 crc kubenswrapper[4786]: I0313 11:52:30.874034 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c04fd277f39c170fc34610b96c72c86d4e70167ce56c9fe7f143b4dd265a89b4"} Mar 13 11:52:30 crc kubenswrapper[4786]: I0313 11:52:30.874142 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e314d914a6f72a4efda2e0216fc542fb8acd3b6c1526e34d9ea3e0e752eec9df"} Mar 13 11:52:30 crc kubenswrapper[4786]: I0313 11:52:30.874353 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:52:30 crc kubenswrapper[4786]: I0313 11:52:30.876763 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b82b83e1e32656a6dc3ace2b17adc4b0e8c7f59af876582d90d1b75c486fc272"} Mar 13 11:52:30 crc kubenswrapper[4786]: I0313 11:52:30.877038 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8447ec91d1b05ebe1a817f4bb555ed7e2322db99211e82d4ed8c24a8b8286747"} Mar 13 11:52:32 crc kubenswrapper[4786]: I0313 11:52:32.893164 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 13 11:52:32 crc kubenswrapper[4786]: I0313 11:52:32.893276 4786 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="7b81ec5fae08d349bc709fb7495285407d042655ec2254b50a4b65722479c243" exitCode=255 Mar 13 11:52:32 crc kubenswrapper[4786]: I0313 11:52:32.893334 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"7b81ec5fae08d349bc709fb7495285407d042655ec2254b50a4b65722479c243"} Mar 13 11:52:32 crc kubenswrapper[4786]: I0313 11:52:32.893912 4786 scope.go:117] "RemoveContainer" containerID="7b81ec5fae08d349bc709fb7495285407d042655ec2254b50a4b65722479c243" Mar 13 11:52:33 crc kubenswrapper[4786]: I0313 11:52:33.469364 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a5c945a6-c15b-457e-8a5d-28a0e2693c47" Mar 13 11:52:33 crc kubenswrapper[4786]: I0313 11:52:33.908391 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 13 11:52:33 crc kubenswrapper[4786]: I0313 11:52:33.909580 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 13 11:52:33 crc kubenswrapper[4786]: I0313 11:52:33.909738 4786 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="e0a1e18ee8aafe700e4f60ff68c55fe9c8250895937c59d24b1a877a73f5cec5" exitCode=255 Mar 13 11:52:33 crc kubenswrapper[4786]: I0313 11:52:33.909863 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"e0a1e18ee8aafe700e4f60ff68c55fe9c8250895937c59d24b1a877a73f5cec5"} Mar 13 11:52:33 crc kubenswrapper[4786]: I0313 11:52:33.909971 4786 scope.go:117] "RemoveContainer" containerID="7b81ec5fae08d349bc709fb7495285407d042655ec2254b50a4b65722479c243" Mar 13 11:52:33 crc kubenswrapper[4786]: I0313 11:52:33.910388 4786 scope.go:117] "RemoveContainer" containerID="e0a1e18ee8aafe700e4f60ff68c55fe9c8250895937c59d24b1a877a73f5cec5" Mar 13 11:52:33 crc kubenswrapper[4786]: E0313 11:52:33.910620 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:52:34 crc kubenswrapper[4786]: I0313 11:52:34.131785 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 11:52:34 crc kubenswrapper[4786]: I0313 11:52:34.534377 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 11:52:34 crc kubenswrapper[4786]: I0313 11:52:34.895577 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 11:52:34 crc kubenswrapper[4786]: I0313 11:52:34.920586 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 13 11:52:35 crc kubenswrapper[4786]: I0313 11:52:35.277209 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 11:52:35 crc kubenswrapper[4786]: I0313 11:52:35.881253 4786 patch_prober.go:28] interesting pod/route-controller-manager-6c7595bd78-q7j4t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:52:35 crc kubenswrapper[4786]: I0313 11:52:35.881632 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" podUID="0aa29cab-8be4-4999-aea4-c8c84acb6f5f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Mar 13 11:52:35 crc kubenswrapper[4786]: I0313 11:52:35.892090 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.010468 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.157192 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.210120 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.230576 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.279672 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.345605 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.524257 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.529459 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.542645 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.673041 4786 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.754267 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 11:52:36 crc kubenswrapper[4786]: I0313 11:52:36.991543 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 11:52:37 crc kubenswrapper[4786]: I0313 11:52:37.444788 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 11:52:37 crc kubenswrapper[4786]: I0313 11:52:37.455908 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 11:52:37 crc kubenswrapper[4786]: I0313 11:52:37.529783 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 11:52:37 crc kubenswrapper[4786]: I0313 11:52:37.878279 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 11:52:37 crc kubenswrapper[4786]: I0313 11:52:37.945405 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.111455 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.135510 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.190809 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.219865 4786 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.242109 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.276357 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.392478 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.402329 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.467645 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.573415 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.608706 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.702743 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.754393 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.790826 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 
13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.824402 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.869148 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.872051 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.894137 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.913226 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.935213 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.995215 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 11:52:38 crc kubenswrapper[4786]: I0313 11:52:38.997573 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.181536 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.212034 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.221594 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.266003 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.269817 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.418189 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.509613 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.557726 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.571124 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.631485 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.705976 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.930326 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 11:52:39 crc kubenswrapper[4786]: I0313 11:52:39.937109 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.018753 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.092948 4786 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.209626 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.307634 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.400803 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.471292 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.590944 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.719015 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.860370 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.862553 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" 
Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.863210 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.898563 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.927718 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.949156 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 11:52:40 crc kubenswrapper[4786]: I0313 11:52:40.960619 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.026362 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.050357 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.128596 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.140508 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.141283 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.295664 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.389048 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.484624 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.505108 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.636745 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.751560 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.768384 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.825099 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.882212 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 11:52:41 crc kubenswrapper[4786]: I0313 11:52:41.893805 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.015768 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 11:52:42 crc 
kubenswrapper[4786]: I0313 11:52:42.017530 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.103057 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.156436 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.156555 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.350121 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.364376 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.447942 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.596665 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.624157 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.649562 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.831118 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.972346 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.977733 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4786]: I0313 11:52:42.995061 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.009016 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.011646 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.028841 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.064162 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.074671 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.162061 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.276102 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 11:52:43 crc 
kubenswrapper[4786]: I0313 11:52:43.300986 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.339052 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.399035 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.408270 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.569326 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.620315 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.689101 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.703089 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.770103 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.826398 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.846762 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.918377 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.923965 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.944640 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 11:52:43 crc kubenswrapper[4786]: I0313 11:52:43.991671 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.162277 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.178706 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.195702 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.323436 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.344770 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.391596 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.586617 4786 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.608719 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.640043 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.817406 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.841365 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.844397 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.845693 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.859238 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 11:52:44 crc kubenswrapper[4786]: I0313 11:52:44.929921 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.044090 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.071241 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 11:52:45 crc 
kubenswrapper[4786]: I0313 11:52:45.154424 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.155480 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.173929 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.232466 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.267761 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.270873 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.329490 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.368469 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.392465 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.507178 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.553676 4786 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.561856 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.584502 4786 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.632029 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.697737 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.698980 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.740969 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.755031 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.852369 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 11:52:45 crc kubenswrapper[4786]: I0313 11:52:45.922347 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.113322 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 11:52:46 crc kubenswrapper[4786]: 
I0313 11:52:46.206609 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.264414 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.324127 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.400937 4786 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.433174 4786 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.434709 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t" podStartSLOduration=45.434695134 podStartE2EDuration="45.434695134s" podCreationTimestamp="2026-03-13 11:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:25.822853506 +0000 UTC m=+333.102506973" watchObservedRunningTime="2026-03-13 11:52:46.434695134 +0000 UTC m=+353.714348591" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.436921 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7slx","openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.436968 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.436984 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6c7595bd78-q7j4t"] Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.443616 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.462372 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.462353705 podStartE2EDuration="22.462353705s" podCreationTimestamp="2026-03-13 11:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:46.460309779 +0000 UTC m=+353.739963246" watchObservedRunningTime="2026-03-13 11:52:46.462353705 +0000 UTC m=+353.742007152" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.580847 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.586995 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.605521 4786 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.674752 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.695238 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.696073 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 
13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.719085 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.738073 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.780107 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.881392 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 11:52:46 crc kubenswrapper[4786]: I0313 11:52:46.911263 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.001948 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.034067 4786 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.034314 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773" gracePeriod=5 Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.122141 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.157348 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.158351 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.158849 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.232608 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.235618 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.298002 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.319635 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.449816 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e090577d-dd68-4f18-b70a-836560c655ce" path="/var/lib/kubelet/pods/e090577d-dd68-4f18-b70a-836560c655ce/volumes" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.567726 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.595475 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.603592 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.607933 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.647854 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.737569 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.932085 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.933760 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 11:52:47 crc kubenswrapper[4786]: I0313 11:52:47.956285 4786 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.013060 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.047146 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.079573 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7679bd97df-bkcg8"] Mar 13 11:52:48 crc kubenswrapper[4786]: E0313 11:52:48.079774 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.079787 
4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:52:48 crc kubenswrapper[4786]: E0313 11:52:48.079804 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" containerName="installer" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.079810 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" containerName="installer" Mar 13 11:52:48 crc kubenswrapper[4786]: E0313 11:52:48.079820 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e090577d-dd68-4f18-b70a-836560c655ce" containerName="oauth-openshift" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.079826 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e090577d-dd68-4f18-b70a-836560c655ce" containerName="oauth-openshift" Mar 13 11:52:48 crc kubenswrapper[4786]: E0313 11:52:48.079834 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" containerName="oc" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.079840 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" containerName="oc" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.079967 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd1b43f-299a-49c8-a8c3-d684c0ee2c62" containerName="installer" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.079981 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.079993 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" containerName="oc" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.080005 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e090577d-dd68-4f18-b70a-836560c655ce" containerName="oauth-openshift" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.080353 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.086750 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.086774 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.086780 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.086784 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.087429 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.087675 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.089085 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.089159 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.089185 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 
11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.089200 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.089389 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.089549 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.103778 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.105250 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.110032 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.111675 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7679bd97df-bkcg8"] Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177382 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-session\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177427 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177448 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-error\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177473 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-login\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177495 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177520 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177549 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177576 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-audit-policies\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177602 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177632 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85917a9e-4613-4dce-b269-3d620b5eeccd-audit-dir\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177659 
4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvcqj\" (UniqueName: \"kubernetes.io/projected/85917a9e-4613-4dce-b269-3d620b5eeccd-kube-api-access-jvcqj\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177683 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-router-certs\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177700 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-service-ca\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.177719 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.197168 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 11:52:48 crc 
kubenswrapper[4786]: I0313 11:52:48.216162 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.222231 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.262563 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278344 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85917a9e-4613-4dce-b269-3d620b5eeccd-audit-dir\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278409 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvcqj\" (UniqueName: \"kubernetes.io/projected/85917a9e-4613-4dce-b269-3d620b5eeccd-kube-api-access-jvcqj\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278433 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-router-certs\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278500 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/85917a9e-4613-4dce-b269-3d620b5eeccd-audit-dir\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278551 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-service-ca\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278581 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278601 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-session\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " 
pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-error\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278892 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-login\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.278922 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.279059 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.279084 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.279211 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-audit-policies\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.279234 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.279323 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-service-ca\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.280167 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " 
pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.280725 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-audit-policies\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.281380 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.285553 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-session\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.285849 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.285661 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-error\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.285689 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-router-certs\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.287879 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.292254 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.296252 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-template-login\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " 
pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.296664 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85917a9e-4613-4dce-b269-3d620b5eeccd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.304514 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvcqj\" (UniqueName: \"kubernetes.io/projected/85917a9e-4613-4dce-b269-3d620b5eeccd-kube-api-access-jvcqj\") pod \"oauth-openshift-7679bd97df-bkcg8\" (UID: \"85917a9e-4613-4dce-b269-3d620b5eeccd\") " pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.394412 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.422503 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.462224 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.555567 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.767990 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.810542 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.829802 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.848436 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.856856 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7679bd97df-bkcg8"] Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.911175 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:52:48 crc kubenswrapper[4786]: I0313 11:52:48.943795 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 11:52:49 crc 
kubenswrapper[4786]: I0313 11:52:49.014307 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" event={"ID":"85917a9e-4613-4dce-b269-3d620b5eeccd","Type":"ContainerStarted","Data":"5e3d0a61c96d03cbdacdffebd551596f2a9d25f92ed798ef481aba28109293a0"} Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.074706 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.187365 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.413138 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.448916 4786 scope.go:117] "RemoveContainer" containerID="e0a1e18ee8aafe700e4f60ff68c55fe9c8250895937c59d24b1a877a73f5cec5" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.474588 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.542400 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.629794 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.785692 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.897030 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 
11:52:49.897571 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 11:52:49 crc kubenswrapper[4786]: I0313 11:52:49.969036 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.020618 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" event={"ID":"85917a9e-4613-4dce-b269-3d620b5eeccd","Type":"ContainerStarted","Data":"b797dfd24c3603a3ac2dfb40c877d8029f603146c72c049762219fd40f32feac"} Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.021136 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.023321 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.023447 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1a0fb5d17d7031562621b557fb524c62ce31a5790e885c3e617e4a67cc631e51"} Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.042793 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.050076 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7679bd97df-bkcg8" podStartSLOduration=65.050057743 podStartE2EDuration="1m5.050057743s" podCreationTimestamp="2026-03-13 11:51:45 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:50.049051425 +0000 UTC m=+357.328704892" watchObservedRunningTime="2026-03-13 11:52:50.050057743 +0000 UTC m=+357.329711190" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.095197 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.157425 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.333029 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.352237 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.659170 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 11:52:50 crc kubenswrapper[4786]: I0313 11:52:50.822700 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 11:52:51 crc kubenswrapper[4786]: I0313 11:52:51.050214 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 11:52:51 crc kubenswrapper[4786]: I0313 11:52:51.894304 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.203577 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.487543 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.618475 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.618547 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634106 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634188 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634220 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634263 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634333 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634358 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634408 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634425 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.634509 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.635019 4786 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.635048 4786 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.635064 4786 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.635105 4786 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.642673 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.735898 4786 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4786]: I0313 11:52:52.998051 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 11:52:53 crc kubenswrapper[4786]: I0313 11:52:53.050989 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 11:52:53 crc kubenswrapper[4786]: I0313 11:52:53.051048 4786 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773" exitCode=137 Mar 13 11:52:53 crc kubenswrapper[4786]: I0313 11:52:53.051093 4786 scope.go:117] "RemoveContainer" containerID="88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773" Mar 13 11:52:53 crc kubenswrapper[4786]: I0313 11:52:53.051154 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:53 crc kubenswrapper[4786]: I0313 11:52:53.071161 4786 scope.go:117] "RemoveContainer" containerID="88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773" Mar 13 11:52:53 crc kubenswrapper[4786]: E0313 11:52:53.071592 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773\": container with ID starting with 88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773 not found: ID does not exist" containerID="88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773" Mar 13 11:52:53 crc kubenswrapper[4786]: I0313 11:52:53.071659 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773"} err="failed to get container status \"88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773\": rpc error: code = NotFound desc = could not find container \"88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773\": container with ID starting with 88e6174e748f1caa42d0069464c75669d6c853ada8dc74a0b12c215d643f9773 not found: ID does not exist" Mar 13 11:52:53 crc kubenswrapper[4786]: I0313 11:52:53.264767 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 11:52:53 crc kubenswrapper[4786]: I0313 11:52:53.451298 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 11:53:03 crc kubenswrapper[4786]: I0313 11:53:03.033332 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 11:53:09 crc kubenswrapper[4786]: I0313 
11:53:09.587332 4786 scope.go:117] "RemoveContainer" containerID="f925aa7a83d93ababe1dfb287e3daf22fc73fbba651824aaeffc36378c030fd6" Mar 13 11:53:09 crc kubenswrapper[4786]: I0313 11:53:09.612371 4786 scope.go:117] "RemoveContainer" containerID="a94797227b63cf10c8a7749433d6d7ea5bf678a25e1603a9fe463750e6d8f586" Mar 13 11:53:09 crc kubenswrapper[4786]: I0313 11:53:09.633626 4786 scope.go:117] "RemoveContainer" containerID="1410e3080da627328b7b14b18246efad872a5233d0e0f59a3412de33a872b302" Mar 13 11:53:09 crc kubenswrapper[4786]: I0313 11:53:09.653795 4786 scope.go:117] "RemoveContainer" containerID="3375f333484b1dea363aaaa30d681f7d13261278ce2fc33f05b94f1da86b297b" Mar 13 11:53:09 crc kubenswrapper[4786]: I0313 11:53:09.670596 4786 scope.go:117] "RemoveContainer" containerID="ddfe80ec8d3c09f652de8d618fd63e453c509e474736d42bbecbbad0f9205706" Mar 13 11:53:09 crc kubenswrapper[4786]: I0313 11:53:09.793159 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:53:14 crc kubenswrapper[4786]: I0313 11:53:14.183456 4786 generic.go:334] "Generic (PLEG): container finished" podID="5c09ab49-3d49-495b-af13-5fd937259b53" containerID="d9e2a18ec26abb77781ac10c014721ebaa13bc4e975384eb1fb6072464617c55" exitCode=0 Mar 13 11:53:14 crc kubenswrapper[4786]: I0313 11:53:14.183538 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" event={"ID":"5c09ab49-3d49-495b-af13-5fd937259b53","Type":"ContainerDied","Data":"d9e2a18ec26abb77781ac10c014721ebaa13bc4e975384eb1fb6072464617c55"} Mar 13 11:53:14 crc kubenswrapper[4786]: I0313 11:53:14.184531 4786 scope.go:117] "RemoveContainer" containerID="d9e2a18ec26abb77781ac10c014721ebaa13bc4e975384eb1fb6072464617c55" Mar 13 11:53:15 crc kubenswrapper[4786]: I0313 11:53:15.191730 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" 
event={"ID":"5c09ab49-3d49-495b-af13-5fd937259b53","Type":"ContainerStarted","Data":"1ac2bd7182458904371963e95fc8dcdf34f60c62b67460dccd4b7cc08927b9c7"} Mar 13 11:53:15 crc kubenswrapper[4786]: I0313 11:53:15.192391 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:53:15 crc kubenswrapper[4786]: I0313 11:53:15.193683 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:53:55 crc kubenswrapper[4786]: I0313 11:53:55.847617 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2rscr"] Mar 13 11:53:55 crc kubenswrapper[4786]: I0313 11:53:55.850281 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:55 crc kubenswrapper[4786]: I0313 11:53:55.861867 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2rscr"] Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.044615 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.044703 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-registry-tls\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 
11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.044739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9634c597-38a3-4eea-8914-9978afb8ee13-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.044844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9634c597-38a3-4eea-8914-9978afb8ee13-registry-certificates\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.045062 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-bound-sa-token\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.045118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9634c597-38a3-4eea-8914-9978afb8ee13-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.045410 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzqpr\" (UniqueName: 
\"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-kube-api-access-xzqpr\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.045499 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9634c597-38a3-4eea-8914-9978afb8ee13-trusted-ca\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.074103 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.147069 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzqpr\" (UniqueName: \"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-kube-api-access-xzqpr\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.147158 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9634c597-38a3-4eea-8914-9978afb8ee13-trusted-ca\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.147226 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-registry-tls\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.147257 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9634c597-38a3-4eea-8914-9978afb8ee13-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.147307 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9634c597-38a3-4eea-8914-9978afb8ee13-registry-certificates\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.148186 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-bound-sa-token\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.148261 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9634c597-38a3-4eea-8914-9978afb8ee13-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.148932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9634c597-38a3-4eea-8914-9978afb8ee13-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.149457 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9634c597-38a3-4eea-8914-9978afb8ee13-registry-certificates\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.150150 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9634c597-38a3-4eea-8914-9978afb8ee13-trusted-ca\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.156863 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9634c597-38a3-4eea-8914-9978afb8ee13-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.157134 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-registry-tls\") pod \"image-registry-66df7c8f76-2rscr\" (UID: 
\"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.167713 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzqpr\" (UniqueName: \"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-kube-api-access-xzqpr\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.177020 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9634c597-38a3-4eea-8914-9978afb8ee13-bound-sa-token\") pod \"image-registry-66df7c8f76-2rscr\" (UID: \"9634c597-38a3-4eea-8914-9978afb8ee13\") " pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.469732 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:56 crc kubenswrapper[4786]: I0313 11:53:56.934581 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2rscr"] Mar 13 11:53:57 crc kubenswrapper[4786]: I0313 11:53:57.478718 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" event={"ID":"9634c597-38a3-4eea-8914-9978afb8ee13","Type":"ContainerStarted","Data":"cdf0ed843da791cd51aa1fd1a80ce429fbbef29a1373a052f6bc245e963c1dd8"} Mar 13 11:53:57 crc kubenswrapper[4786]: I0313 11:53:57.479174 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" event={"ID":"9634c597-38a3-4eea-8914-9978afb8ee13","Type":"ContainerStarted","Data":"e9a061b8d996caeb16178e6a18f84df7b377398a2a0311330588b43bff2fea6a"} Mar 13 11:53:57 crc kubenswrapper[4786]: I0313 11:53:57.479213 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:53:57 crc kubenswrapper[4786]: I0313 11:53:57.511798 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" podStartSLOduration=2.511773518 podStartE2EDuration="2.511773518s" podCreationTimestamp="2026-03-13 11:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:53:57.502209165 +0000 UTC m=+424.781862682" watchObservedRunningTime="2026-03-13 11:53:57.511773518 +0000 UTC m=+424.791426995" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.139210 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556714-pfqgz"] Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.140964 4786 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.143221 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.143784 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.143914 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.146564 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-pfqgz"] Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.303785 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lddz\" (UniqueName: \"kubernetes.io/projected/a1c4fad7-502e-4920-8e7e-5e096b9f6653-kube-api-access-6lddz\") pod \"auto-csr-approver-29556714-pfqgz\" (UID: \"a1c4fad7-502e-4920-8e7e-5e096b9f6653\") " pod="openshift-infra/auto-csr-approver-29556714-pfqgz" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.405082 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lddz\" (UniqueName: \"kubernetes.io/projected/a1c4fad7-502e-4920-8e7e-5e096b9f6653-kube-api-access-6lddz\") pod \"auto-csr-approver-29556714-pfqgz\" (UID: \"a1c4fad7-502e-4920-8e7e-5e096b9f6653\") " pod="openshift-infra/auto-csr-approver-29556714-pfqgz" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.442287 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lddz\" (UniqueName: \"kubernetes.io/projected/a1c4fad7-502e-4920-8e7e-5e096b9f6653-kube-api-access-6lddz\") pod \"auto-csr-approver-29556714-pfqgz\" (UID: \"a1c4fad7-502e-4920-8e7e-5e096b9f6653\") " 
pod="openshift-infra/auto-csr-approver-29556714-pfqgz" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.459594 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" Mar 13 11:54:00 crc kubenswrapper[4786]: I0313 11:54:00.906287 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-pfqgz"] Mar 13 11:54:01 crc kubenswrapper[4786]: I0313 11:54:01.505055 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" event={"ID":"a1c4fad7-502e-4920-8e7e-5e096b9f6653","Type":"ContainerStarted","Data":"a0f2c63520cd22cf65c0f8e5f3b5b8d537801be8d6e3ff67e2717683f5281e1a"} Mar 13 11:54:02 crc kubenswrapper[4786]: I0313 11:54:02.511455 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" event={"ID":"a1c4fad7-502e-4920-8e7e-5e096b9f6653","Type":"ContainerStarted","Data":"6fc01c2b9c755543979a57ea696ee8afaa8ebb3f236447878ed5b226656059ec"} Mar 13 11:54:02 crc kubenswrapper[4786]: I0313 11:54:02.528106 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" podStartSLOduration=1.3992491679999999 podStartE2EDuration="2.528080305s" podCreationTimestamp="2026-03-13 11:54:00 +0000 UTC" firstStartedPulling="2026-03-13 11:54:00.914061704 +0000 UTC m=+428.193715161" lastFinishedPulling="2026-03-13 11:54:02.042892841 +0000 UTC m=+429.322546298" observedRunningTime="2026-03-13 11:54:02.522057888 +0000 UTC m=+429.801711335" watchObservedRunningTime="2026-03-13 11:54:02.528080305 +0000 UTC m=+429.807733742" Mar 13 11:54:03 crc kubenswrapper[4786]: I0313 11:54:03.519814 4786 generic.go:334] "Generic (PLEG): container finished" podID="a1c4fad7-502e-4920-8e7e-5e096b9f6653" containerID="6fc01c2b9c755543979a57ea696ee8afaa8ebb3f236447878ed5b226656059ec" exitCode=0 Mar 13 11:54:03 crc 
kubenswrapper[4786]: I0313 11:54:03.519900 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" event={"ID":"a1c4fad7-502e-4920-8e7e-5e096b9f6653","Type":"ContainerDied","Data":"6fc01c2b9c755543979a57ea696ee8afaa8ebb3f236447878ed5b226656059ec"} Mar 13 11:54:04 crc kubenswrapper[4786]: I0313 11:54:04.870714 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" Mar 13 11:54:04 crc kubenswrapper[4786]: I0313 11:54:04.968164 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lddz\" (UniqueName: \"kubernetes.io/projected/a1c4fad7-502e-4920-8e7e-5e096b9f6653-kube-api-access-6lddz\") pod \"a1c4fad7-502e-4920-8e7e-5e096b9f6653\" (UID: \"a1c4fad7-502e-4920-8e7e-5e096b9f6653\") " Mar 13 11:54:04 crc kubenswrapper[4786]: I0313 11:54:04.974336 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c4fad7-502e-4920-8e7e-5e096b9f6653-kube-api-access-6lddz" (OuterVolumeSpecName: "kube-api-access-6lddz") pod "a1c4fad7-502e-4920-8e7e-5e096b9f6653" (UID: "a1c4fad7-502e-4920-8e7e-5e096b9f6653"). InnerVolumeSpecName "kube-api-access-6lddz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:05 crc kubenswrapper[4786]: I0313 11:54:05.070770 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lddz\" (UniqueName: \"kubernetes.io/projected/a1c4fad7-502e-4920-8e7e-5e096b9f6653-kube-api-access-6lddz\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:05 crc kubenswrapper[4786]: I0313 11:54:05.537424 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" event={"ID":"a1c4fad7-502e-4920-8e7e-5e096b9f6653","Type":"ContainerDied","Data":"a0f2c63520cd22cf65c0f8e5f3b5b8d537801be8d6e3ff67e2717683f5281e1a"} Mar 13 11:54:05 crc kubenswrapper[4786]: I0313 11:54:05.537478 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0f2c63520cd22cf65c0f8e5f3b5b8d537801be8d6e3ff67e2717683f5281e1a" Mar 13 11:54:05 crc kubenswrapper[4786]: I0313 11:54:05.537530 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-pfqgz" Mar 13 11:54:08 crc kubenswrapper[4786]: I0313 11:54:08.171154 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:54:08 crc kubenswrapper[4786]: I0313 11:54:08.171423 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:54:16 crc kubenswrapper[4786]: I0313 11:54:16.474976 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-2rscr" Mar 13 11:54:16 crc kubenswrapper[4786]: I0313 11:54:16.538639 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4rkw"] Mar 13 11:54:38 crc kubenswrapper[4786]: I0313 11:54:38.169997 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:54:38 crc kubenswrapper[4786]: I0313 11:54:38.170783 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:54:41 crc kubenswrapper[4786]: I0313 11:54:41.600417 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" podUID="525e850e-04a9-4dc1-91ab-a508136a5e60" containerName="registry" containerID="cri-o://6da584293915363685a1191078c482657bfd840a53e8c21dab386070ed6a0e5e" gracePeriod=30 Mar 13 11:54:41 crc kubenswrapper[4786]: I0313 11:54:41.796791 4786 generic.go:334] "Generic (PLEG): container finished" podID="525e850e-04a9-4dc1-91ab-a508136a5e60" containerID="6da584293915363685a1191078c482657bfd840a53e8c21dab386070ed6a0e5e" exitCode=0 Mar 13 11:54:41 crc kubenswrapper[4786]: I0313 11:54:41.796927 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" event={"ID":"525e850e-04a9-4dc1-91ab-a508136a5e60","Type":"ContainerDied","Data":"6da584293915363685a1191078c482657bfd840a53e8c21dab386070ed6a0e5e"} Mar 13 11:54:41 crc kubenswrapper[4786]: I0313 
11:54:41.957165 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.028782 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-bound-sa-token\") pod \"525e850e-04a9-4dc1-91ab-a508136a5e60\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.029098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-certificates\") pod \"525e850e-04a9-4dc1-91ab-a508136a5e60\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.029225 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"525e850e-04a9-4dc1-91ab-a508136a5e60\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.029267 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-tls\") pod \"525e850e-04a9-4dc1-91ab-a508136a5e60\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.029322 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-trusted-ca\") pod \"525e850e-04a9-4dc1-91ab-a508136a5e60\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 
11:54:42.029348 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/525e850e-04a9-4dc1-91ab-a508136a5e60-installation-pull-secrets\") pod \"525e850e-04a9-4dc1-91ab-a508136a5e60\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.029374 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxd2z\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-kube-api-access-xxd2z\") pod \"525e850e-04a9-4dc1-91ab-a508136a5e60\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.029396 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/525e850e-04a9-4dc1-91ab-a508136a5e60-ca-trust-extracted\") pod \"525e850e-04a9-4dc1-91ab-a508136a5e60\" (UID: \"525e850e-04a9-4dc1-91ab-a508136a5e60\") " Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.030192 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "525e850e-04a9-4dc1-91ab-a508136a5e60" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.030604 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "525e850e-04a9-4dc1-91ab-a508136a5e60" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.034995 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "525e850e-04a9-4dc1-91ab-a508136a5e60" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.035126 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525e850e-04a9-4dc1-91ab-a508136a5e60-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "525e850e-04a9-4dc1-91ab-a508136a5e60" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.037195 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-kube-api-access-xxd2z" (OuterVolumeSpecName: "kube-api-access-xxd2z") pod "525e850e-04a9-4dc1-91ab-a508136a5e60" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60"). InnerVolumeSpecName "kube-api-access-xxd2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.040059 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "525e850e-04a9-4dc1-91ab-a508136a5e60" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.041082 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "525e850e-04a9-4dc1-91ab-a508136a5e60" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.049432 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525e850e-04a9-4dc1-91ab-a508136a5e60-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "525e850e-04a9-4dc1-91ab-a508136a5e60" (UID: "525e850e-04a9-4dc1-91ab-a508136a5e60"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.130468 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.130515 4786 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.130530 4786 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.130543 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/525e850e-04a9-4dc1-91ab-a508136a5e60-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.130553 4786 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/525e850e-04a9-4dc1-91ab-a508136a5e60-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.130563 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxd2z\" (UniqueName: \"kubernetes.io/projected/525e850e-04a9-4dc1-91ab-a508136a5e60-kube-api-access-xxd2z\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.130572 4786 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/525e850e-04a9-4dc1-91ab-a508136a5e60-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.805747 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" event={"ID":"525e850e-04a9-4dc1-91ab-a508136a5e60","Type":"ContainerDied","Data":"ca14c9c79fb4546f60938a5650d957817ad2434123a8236adb1e0ead3ed98a44"} Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.805813 4786 scope.go:117] "RemoveContainer" containerID="6da584293915363685a1191078c482657bfd840a53e8c21dab386070ed6a0e5e" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.805812 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q4rkw" Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.848544 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4rkw"] Mar 13 11:54:42 crc kubenswrapper[4786]: I0313 11:54:42.855630 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4rkw"] Mar 13 11:54:43 crc kubenswrapper[4786]: I0313 11:54:43.449053 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="525e850e-04a9-4dc1-91ab-a508136a5e60" path="/var/lib/kubelet/pods/525e850e-04a9-4dc1-91ab-a508136a5e60/volumes" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.570569 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94x7m"] Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.573833 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-94x7m" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="registry-server" containerID="cri-o://3c458de7ab7dff8cb9473785e2e04d3b9b69180295a3114a33277daa2dd5af89" gracePeriod=30 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.581013 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4tp7g"] Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.581787 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4tp7g" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="registry-server" containerID="cri-o://4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6" gracePeriod=30 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.588120 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-62rhz"] Mar 13 11:54:53 crc 
kubenswrapper[4786]: I0313 11:54:53.588330 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" containerID="cri-o://1ac2bd7182458904371963e95fc8dcdf34f60c62b67460dccd4b7cc08927b9c7" gracePeriod=30 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.606137 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rl4k6"] Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.606646 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525e850e-04a9-4dc1-91ab-a508136a5e60" containerName="registry" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.606754 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="525e850e-04a9-4dc1-91ab-a508136a5e60" containerName="registry" Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.606826 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c4fad7-502e-4920-8e7e-5e096b9f6653" containerName="oc" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.606925 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c4fad7-502e-4920-8e7e-5e096b9f6653" containerName="oc" Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.611655 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3c458de7ab7dff8cb9473785e2e04d3b9b69180295a3114a33277daa2dd5af89" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.613227 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="525e850e-04a9-4dc1-91ab-a508136a5e60" containerName="registry" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.613388 4786 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="a1c4fad7-502e-4920-8e7e-5e096b9f6653" containerName="oc" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.615162 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.616690 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3c458de7ab7dff8cb9473785e2e04d3b9b69180295a3114a33277daa2dd5af89" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.619137 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3c458de7ab7dff8cb9473785e2e04d3b9b69180295a3114a33277daa2dd5af89" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.619220 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/certified-operators-94x7m" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="registry-server" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.626039 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmpwr"] Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.626510 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kmpwr" podUID="1d680740-f193-4a69-8755-d766703cd61a" containerName="registry-server" containerID="cri-o://acd494a68da9d6b486c6956128882e3b9fab934308e2a3e919c981d8f69b5245" gracePeriod=30 Mar 13 11:54:53 crc 
kubenswrapper[4786]: I0313 11:54:53.632339 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rl4k6"] Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.636858 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gfdq"] Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.637438 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4gfdq" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="registry-server" containerID="cri-o://a1c805c41c287573eb6a6e548fdc662c312406b3b3e4b54ab80931782ddb58e7" gracePeriod=30 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.673929 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjl2c\" (UniqueName: \"kubernetes.io/projected/f4cef03e-60f8-491b-9ba5-b93a42121b2e-kube-api-access-bjl2c\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: \"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.674209 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4cef03e-60f8-491b-9ba5-b93a42121b2e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: \"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.674229 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4cef03e-60f8-491b-9ba5-b93a42121b2e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: \"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") 
" pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.771583 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6 is running failed: container process not found" containerID="4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.772056 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6 is running failed: container process not found" containerID="4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.772286 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6 is running failed: container process not found" containerID="4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 11:54:53 crc kubenswrapper[4786]: E0313 11:54:53.772359 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-4tp7g" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="registry-server" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.775251 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bjl2c\" (UniqueName: \"kubernetes.io/projected/f4cef03e-60f8-491b-9ba5-b93a42121b2e-kube-api-access-bjl2c\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: \"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.775314 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4cef03e-60f8-491b-9ba5-b93a42121b2e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: \"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.775342 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4cef03e-60f8-491b-9ba5-b93a42121b2e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: \"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.776553 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4cef03e-60f8-491b-9ba5-b93a42121b2e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: \"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.782560 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4cef03e-60f8-491b-9ba5-b93a42121b2e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: 
\"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.791494 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjl2c\" (UniqueName: \"kubernetes.io/projected/f4cef03e-60f8-491b-9ba5-b93a42121b2e-kube-api-access-bjl2c\") pod \"marketplace-operator-79b997595-rl4k6\" (UID: \"f4cef03e-60f8-491b-9ba5-b93a42121b2e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.901717 4786 generic.go:334] "Generic (PLEG): container finished" podID="1d680740-f193-4a69-8755-d766703cd61a" containerID="acd494a68da9d6b486c6956128882e3b9fab934308e2a3e919c981d8f69b5245" exitCode=0 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.901781 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmpwr" event={"ID":"1d680740-f193-4a69-8755-d766703cd61a","Type":"ContainerDied","Data":"acd494a68da9d6b486c6956128882e3b9fab934308e2a3e919c981d8f69b5245"} Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.908663 4786 generic.go:334] "Generic (PLEG): container finished" podID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerID="3c458de7ab7dff8cb9473785e2e04d3b9b69180295a3114a33277daa2dd5af89" exitCode=0 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.908733 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94x7m" event={"ID":"f8353c7b-cabe-46a6-8a98-aea4bad6b499","Type":"ContainerDied","Data":"3c458de7ab7dff8cb9473785e2e04d3b9b69180295a3114a33277daa2dd5af89"} Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.934490 4786 generic.go:334] "Generic (PLEG): container finished" podID="17bbca1c-a838-4407-834c-45b6129b32b8" containerID="a1c805c41c287573eb6a6e548fdc662c312406b3b3e4b54ab80931782ddb58e7" exitCode=0 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 
11:54:53.934559 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gfdq" event={"ID":"17bbca1c-a838-4407-834c-45b6129b32b8","Type":"ContainerDied","Data":"a1c805c41c287573eb6a6e548fdc662c312406b3b3e4b54ab80931782ddb58e7"} Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.936605 4786 generic.go:334] "Generic (PLEG): container finished" podID="939749d8-2927-47a2-8edc-77b4f307e813" containerID="4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6" exitCode=0 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.936701 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tp7g" event={"ID":"939749d8-2927-47a2-8edc-77b4f307e813","Type":"ContainerDied","Data":"4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6"} Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.938519 4786 generic.go:334] "Generic (PLEG): container finished" podID="5c09ab49-3d49-495b-af13-5fd937259b53" containerID="1ac2bd7182458904371963e95fc8dcdf34f60c62b67460dccd4b7cc08927b9c7" exitCode=0 Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.938546 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" event={"ID":"5c09ab49-3d49-495b-af13-5fd937259b53","Type":"ContainerDied","Data":"1ac2bd7182458904371963e95fc8dcdf34f60c62b67460dccd4b7cc08927b9c7"} Mar 13 11:54:53 crc kubenswrapper[4786]: I0313 11:54:53.938571 4786 scope.go:117] "RemoveContainer" containerID="d9e2a18ec26abb77781ac10c014721ebaa13bc4e975384eb1fb6072464617c55" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.010785 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.021012 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.029009 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.033959 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.060710 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.077621 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-catalog-content\") pod \"1d680740-f193-4a69-8755-d766703cd61a\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.077933 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-utilities\") pod \"1d680740-f193-4a69-8755-d766703cd61a\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.078842 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-utilities\") pod \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.078977 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-catalog-content\") pod 
\"f8353c7b-cabe-46a6-8a98-aea4bad6b499\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079061 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-catalog-content\") pod \"939749d8-2927-47a2-8edc-77b4f307e813\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079151 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzfjw\" (UniqueName: \"kubernetes.io/projected/f8353c7b-cabe-46a6-8a98-aea4bad6b499-kube-api-access-dzfjw\") pod \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\" (UID: \"f8353c7b-cabe-46a6-8a98-aea4bad6b499\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079242 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdgbk\" (UniqueName: \"kubernetes.io/projected/1d680740-f193-4a69-8755-d766703cd61a-kube-api-access-cdgbk\") pod \"1d680740-f193-4a69-8755-d766703cd61a\" (UID: \"1d680740-f193-4a69-8755-d766703cd61a\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079341 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-trusted-ca\") pod \"5c09ab49-3d49-495b-af13-5fd937259b53\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079446 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-utilities\") pod \"939749d8-2927-47a2-8edc-77b4f307e813\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079543 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-m9w9g\" (UniqueName: \"kubernetes.io/projected/939749d8-2927-47a2-8edc-77b4f307e813-kube-api-access-m9w9g\") pod \"939749d8-2927-47a2-8edc-77b4f307e813\" (UID: \"939749d8-2927-47a2-8edc-77b4f307e813\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079636 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dknd4\" (UniqueName: \"kubernetes.io/projected/5c09ab49-3d49-495b-af13-5fd937259b53-kube-api-access-dknd4\") pod \"5c09ab49-3d49-495b-af13-5fd937259b53\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079724 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-operator-metrics\") pod \"5c09ab49-3d49-495b-af13-5fd937259b53\" (UID: \"5c09ab49-3d49-495b-af13-5fd937259b53\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.079404 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-utilities" (OuterVolumeSpecName: "utilities") pod "1d680740-f193-4a69-8755-d766703cd61a" (UID: "1d680740-f193-4a69-8755-d766703cd61a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.080691 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-utilities" (OuterVolumeSpecName: "utilities") pod "939749d8-2927-47a2-8edc-77b4f307e813" (UID: "939749d8-2927-47a2-8edc-77b4f307e813"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.081530 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-utilities" (OuterVolumeSpecName: "utilities") pod "f8353c7b-cabe-46a6-8a98-aea4bad6b499" (UID: "f8353c7b-cabe-46a6-8a98-aea4bad6b499"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.081727 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5c09ab49-3d49-495b-af13-5fd937259b53" (UID: "5c09ab49-3d49-495b-af13-5fd937259b53"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.081940 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.082061 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.082151 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.082231 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-utilities\") on node \"crc\" 
DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.083282 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c09ab49-3d49-495b-af13-5fd937259b53-kube-api-access-dknd4" (OuterVolumeSpecName: "kube-api-access-dknd4") pod "5c09ab49-3d49-495b-af13-5fd937259b53" (UID: "5c09ab49-3d49-495b-af13-5fd937259b53"). InnerVolumeSpecName "kube-api-access-dknd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.085107 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d680740-f193-4a69-8755-d766703cd61a-kube-api-access-cdgbk" (OuterVolumeSpecName: "kube-api-access-cdgbk") pod "1d680740-f193-4a69-8755-d766703cd61a" (UID: "1d680740-f193-4a69-8755-d766703cd61a"). InnerVolumeSpecName "kube-api-access-cdgbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.085180 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8353c7b-cabe-46a6-8a98-aea4bad6b499-kube-api-access-dzfjw" (OuterVolumeSpecName: "kube-api-access-dzfjw") pod "f8353c7b-cabe-46a6-8a98-aea4bad6b499" (UID: "f8353c7b-cabe-46a6-8a98-aea4bad6b499"). InnerVolumeSpecName "kube-api-access-dzfjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.086729 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5c09ab49-3d49-495b-af13-5fd937259b53" (UID: "5c09ab49-3d49-495b-af13-5fd937259b53"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.086925 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939749d8-2927-47a2-8edc-77b4f307e813-kube-api-access-m9w9g" (OuterVolumeSpecName: "kube-api-access-m9w9g") pod "939749d8-2927-47a2-8edc-77b4f307e813" (UID: "939749d8-2927-47a2-8edc-77b4f307e813"). InnerVolumeSpecName "kube-api-access-m9w9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.117168 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.117248 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d680740-f193-4a69-8755-d766703cd61a" (UID: "1d680740-f193-4a69-8755-d766703cd61a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.143583 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8353c7b-cabe-46a6-8a98-aea4bad6b499" (UID: "f8353c7b-cabe-46a6-8a98-aea4bad6b499"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.182993 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-utilities\") pod \"17bbca1c-a838-4407-834c-45b6129b32b8\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.183279 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-catalog-content\") pod \"17bbca1c-a838-4407-834c-45b6129b32b8\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.183337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zh9k\" (UniqueName: \"kubernetes.io/projected/17bbca1c-a838-4407-834c-45b6129b32b8-kube-api-access-4zh9k\") pod \"17bbca1c-a838-4407-834c-45b6129b32b8\" (UID: \"17bbca1c-a838-4407-834c-45b6129b32b8\") " Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.183569 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdgbk\" (UniqueName: \"kubernetes.io/projected/1d680740-f193-4a69-8755-d766703cd61a-kube-api-access-cdgbk\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.183599 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9w9g\" (UniqueName: \"kubernetes.io/projected/939749d8-2927-47a2-8edc-77b4f307e813-kube-api-access-m9w9g\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.183608 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dknd4\" (UniqueName: \"kubernetes.io/projected/5c09ab49-3d49-495b-af13-5fd937259b53-kube-api-access-dknd4\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc 
kubenswrapper[4786]: I0313 11:54:54.183617 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c09ab49-3d49-495b-af13-5fd937259b53-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.183626 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d680740-f193-4a69-8755-d766703cd61a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.183635 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8353c7b-cabe-46a6-8a98-aea4bad6b499-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.183643 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzfjw\" (UniqueName: \"kubernetes.io/projected/f8353c7b-cabe-46a6-8a98-aea4bad6b499-kube-api-access-dzfjw\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.185833 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-utilities" (OuterVolumeSpecName: "utilities") pod "17bbca1c-a838-4407-834c-45b6129b32b8" (UID: "17bbca1c-a838-4407-834c-45b6129b32b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.186917 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17bbca1c-a838-4407-834c-45b6129b32b8-kube-api-access-4zh9k" (OuterVolumeSpecName: "kube-api-access-4zh9k") pod "17bbca1c-a838-4407-834c-45b6129b32b8" (UID: "17bbca1c-a838-4407-834c-45b6129b32b8"). InnerVolumeSpecName "kube-api-access-4zh9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.190937 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "939749d8-2927-47a2-8edc-77b4f307e813" (UID: "939749d8-2927-47a2-8edc-77b4f307e813"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.285214 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939749d8-2927-47a2-8edc-77b4f307e813-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.285266 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zh9k\" (UniqueName: \"kubernetes.io/projected/17bbca1c-a838-4407-834c-45b6129b32b8-kube-api-access-4zh9k\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.285283 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.304771 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17bbca1c-a838-4407-834c-45b6129b32b8" (UID: "17bbca1c-a838-4407-834c-45b6129b32b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.386253 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bbca1c-a838-4407-834c-45b6129b32b8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.468310 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rl4k6"] Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.945991 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmpwr" event={"ID":"1d680740-f193-4a69-8755-d766703cd61a","Type":"ContainerDied","Data":"d86d7b7d29aa0acae548f9e0163b3ce579d57e2eefb905b6b761f88af6bba6c4"} Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.946029 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmpwr" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.946042 4786 scope.go:117] "RemoveContainer" containerID="acd494a68da9d6b486c6956128882e3b9fab934308e2a3e919c981d8f69b5245" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.948147 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" event={"ID":"f4cef03e-60f8-491b-9ba5-b93a42121b2e","Type":"ContainerStarted","Data":"1c7e0332ff96fa53f94f9d6473976d58ed0f7d7a7a498c975837c6b66b825a9a"} Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.948171 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" event={"ID":"f4cef03e-60f8-491b-9ba5-b93a42121b2e","Type":"ContainerStarted","Data":"0c76a35f0acf0126e6fd8da65087b8ffd14a2e3e435ec4e8f29ea5c8584cdf25"} Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.949727 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.952648 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94x7m" event={"ID":"f8353c7b-cabe-46a6-8a98-aea4bad6b499","Type":"ContainerDied","Data":"dd283f372163a5d004c4ac6cc08d9aca038e911762e37ff1f5dd3c483ed4d68a"} Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.952678 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94x7m" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.956022 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gfdq" event={"ID":"17bbca1c-a838-4407-834c-45b6129b32b8","Type":"ContainerDied","Data":"752fb6ed5fa450574193c3c8f1c4eb7256533cb3a6159d330dea19fb4c08255c"} Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.956108 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gfdq" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.958319 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.959856 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4tp7g" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.959868 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tp7g" event={"ID":"939749d8-2927-47a2-8edc-77b4f307e813","Type":"ContainerDied","Data":"a59162bb35190dc1251c09f88ad34faf7fd44d19291aa1b534f6560aea04f927"} Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.961027 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" event={"ID":"5c09ab49-3d49-495b-af13-5fd937259b53","Type":"ContainerDied","Data":"e6aacd243e897fa7a5b3d743fc9a245fb30e96f1e7b8189810ee944c40028af0"} Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.961069 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-62rhz" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.970498 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rl4k6" podStartSLOduration=1.970477129 podStartE2EDuration="1.970477129s" podCreationTimestamp="2026-03-13 11:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:54:54.968172764 +0000 UTC m=+482.247826221" watchObservedRunningTime="2026-03-13 11:54:54.970477129 +0000 UTC m=+482.250130586" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.972259 4786 scope.go:117] "RemoveContainer" containerID="cf81364e364eb6f6cfcd50e6e48583773099a95819226540d6841a177e2131a9" Mar 13 11:54:54 crc kubenswrapper[4786]: I0313 11:54:54.984602 4786 scope.go:117] "RemoveContainer" containerID="4d2635fac95e0680bdf9c0ad1ee7a1a8cf22957d4d1862e0b43dfc3ff765a6c0" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.011763 4786 scope.go:117] "RemoveContainer" 
containerID="3c458de7ab7dff8cb9473785e2e04d3b9b69180295a3114a33277daa2dd5af89" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.029896 4786 scope.go:117] "RemoveContainer" containerID="d683800f8751c675fed57a968f8be2ca7dd6fbbabd9392bd02e31079970554d1" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.038094 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4tp7g"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.047798 4786 scope.go:117] "RemoveContainer" containerID="775192d48c7e0e41b5b9166f5544b91349e6cb1867f234cd246ed08574a9cd24" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.052413 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4tp7g"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.076042 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmpwr"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.087514 4786 scope.go:117] "RemoveContainer" containerID="a1c805c41c287573eb6a6e548fdc662c312406b3b3e4b54ab80931782ddb58e7" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.090347 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmpwr"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.096207 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gfdq"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.101561 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4gfdq"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.104617 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94x7m"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.104716 4786 scope.go:117] "RemoveContainer" containerID="931bdea50fca736cfc8b71a47e129ef79b58f00021abeb2b983a2c39e96fc451" 
Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.107296 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-94x7m"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.110047 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-62rhz"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.112456 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-62rhz"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.121640 4786 scope.go:117] "RemoveContainer" containerID="f3011311a409ea03706c0bafec45ae414f9caa64b5fd14e8b44f9f67923514bd" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.132493 4786 scope.go:117] "RemoveContainer" containerID="4da916a6b834ed07a2f5b3a8c06daba99b866c5bb6594e682140fc83851293e6" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.152363 4786 scope.go:117] "RemoveContainer" containerID="7f9708bfe361fbd8f2c02fee29ecd426f04e52a177788f20f6e6fed6e33b2eb8" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.170152 4786 scope.go:117] "RemoveContainer" containerID="46563af448b1b26370749e0671a79020737c4274364c9f2f18f70e237f268b80" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.182049 4786 scope.go:117] "RemoveContainer" containerID="1ac2bd7182458904371963e95fc8dcdf34f60c62b67460dccd4b7cc08927b9c7" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.464085 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" path="/var/lib/kubelet/pods/17bbca1c-a838-4407-834c-45b6129b32b8/volumes" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.465660 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d680740-f193-4a69-8755-d766703cd61a" path="/var/lib/kubelet/pods/1d680740-f193-4a69-8755-d766703cd61a/volumes" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.466984 
4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" path="/var/lib/kubelet/pods/5c09ab49-3d49-495b-af13-5fd937259b53/volumes" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.468799 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="939749d8-2927-47a2-8edc-77b4f307e813" path="/var/lib/kubelet/pods/939749d8-2927-47a2-8edc-77b4f307e813/volumes" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.470514 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" path="/var/lib/kubelet/pods/f8353c7b-cabe-46a6-8a98-aea4bad6b499/volumes" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785454 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cwtl5"] Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785727 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="extract-utilities" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785744 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="extract-utilities" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785754 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="extract-content" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785762 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="extract-content" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785777 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d680740-f193-4a69-8755-d766703cd61a" containerName="extract-content" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785786 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1d680740-f193-4a69-8755-d766703cd61a" containerName="extract-content" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785796 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="extract-utilities" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785802 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="extract-utilities" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785812 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d680740-f193-4a69-8755-d766703cd61a" containerName="extract-utilities" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785819 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d680740-f193-4a69-8755-d766703cd61a" containerName="extract-utilities" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785828 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="extract-content" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785835 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="extract-content" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785845 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785851 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785859 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785865 4786 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785873 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="extract-content" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785879 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="extract-content" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785910 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785917 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785926 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d680740-f193-4a69-8755-d766703cd61a" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785933 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d680740-f193-4a69-8755-d766703cd61a" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785945 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="extract-utilities" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785952 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="extract-utilities" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.785961 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.785967 4786 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.786063 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d680740-f193-4a69-8755-d766703cd61a" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.786070 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.786080 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.786088 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="17bbca1c-a838-4407-834c-45b6129b32b8" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.786098 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="939749d8-2927-47a2-8edc-77b4f307e813" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.786106 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8353c7b-cabe-46a6-8a98-aea4bad6b499" containerName="registry-server" Mar 13 11:54:55 crc kubenswrapper[4786]: E0313 11:54:55.786207 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.786214 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c09ab49-3d49-495b-af13-5fd937259b53" containerName="marketplace-operator" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.786840 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.789143 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.790086 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cwtl5"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.800302 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jw58\" (UniqueName: \"kubernetes.io/projected/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-kube-api-access-2jw58\") pod \"redhat-marketplace-cwtl5\" (UID: \"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.800370 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-catalog-content\") pod \"redhat-marketplace-cwtl5\" (UID: \"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.800451 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-utilities\") pod \"redhat-marketplace-cwtl5\" (UID: \"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.902031 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jw58\" (UniqueName: \"kubernetes.io/projected/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-kube-api-access-2jw58\") pod \"redhat-marketplace-cwtl5\" (UID: 
\"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.902138 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-catalog-content\") pod \"redhat-marketplace-cwtl5\" (UID: \"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.902248 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-utilities\") pod \"redhat-marketplace-cwtl5\" (UID: \"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.902780 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-catalog-content\") pod \"redhat-marketplace-cwtl5\" (UID: \"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.902993 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-utilities\") pod \"redhat-marketplace-cwtl5\" (UID: \"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.926345 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jw58\" (UniqueName: \"kubernetes.io/projected/5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5-kube-api-access-2jw58\") pod \"redhat-marketplace-cwtl5\" (UID: \"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5\") " 
pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.983366 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jjl8g"] Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.984639 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.987060 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 11:54:55 crc kubenswrapper[4786]: I0313 11:54:55.988949 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjl8g"] Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.003224 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f417ecf-9afd-4518-b54c-dbfedb17c67a-catalog-content\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.003296 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f417ecf-9afd-4518-b54c-dbfedb17c67a-utilities\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.003347 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwqtd\" (UniqueName: \"kubernetes.io/projected/0f417ecf-9afd-4518-b54c-dbfedb17c67a-kube-api-access-cwqtd\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " 
pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.105025 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f417ecf-9afd-4518-b54c-dbfedb17c67a-catalog-content\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.105088 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f417ecf-9afd-4518-b54c-dbfedb17c67a-utilities\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.105116 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwqtd\" (UniqueName: \"kubernetes.io/projected/0f417ecf-9afd-4518-b54c-dbfedb17c67a-kube-api-access-cwqtd\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.105556 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f417ecf-9afd-4518-b54c-dbfedb17c67a-catalog-content\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.105799 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f417ecf-9afd-4518-b54c-dbfedb17c67a-utilities\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " 
pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.113073 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.123121 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwqtd\" (UniqueName: \"kubernetes.io/projected/0f417ecf-9afd-4518-b54c-dbfedb17c67a-kube-api-access-cwqtd\") pod \"certified-operators-jjl8g\" (UID: \"0f417ecf-9afd-4518-b54c-dbfedb17c67a\") " pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.306650 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.337195 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cwtl5"] Mar 13 11:54:56 crc kubenswrapper[4786]: W0313 11:54:56.347377 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f6e6ad0_c921_4367_82d1_0f0c4c4bcba5.slice/crio-8db2c00e92f293f025be97ee088bd439973ee34eea1abc27c5cb8a5f36cbd2c0 WatchSource:0}: Error finding container 8db2c00e92f293f025be97ee088bd439973ee34eea1abc27c5cb8a5f36cbd2c0: Status 404 returned error can't find the container with id 8db2c00e92f293f025be97ee088bd439973ee34eea1abc27c5cb8a5f36cbd2c0 Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.479972 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjl8g"] Mar 13 11:54:56 crc kubenswrapper[4786]: W0313 11:54:56.487965 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f417ecf_9afd_4518_b54c_dbfedb17c67a.slice/crio-ffe3eb7e3b09ca1cb331e841692f4c37ac25befa1f0fe104f9c371a3b796f1f1 WatchSource:0}: Error finding container ffe3eb7e3b09ca1cb331e841692f4c37ac25befa1f0fe104f9c371a3b796f1f1: Status 404 returned error can't find the container with id ffe3eb7e3b09ca1cb331e841692f4c37ac25befa1f0fe104f9c371a3b796f1f1 Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.987408 4786 generic.go:334] "Generic (PLEG): container finished" podID="0f417ecf-9afd-4518-b54c-dbfedb17c67a" containerID="df97d2e738ae538d7e690c9eaeacaf39c1143964b37c99b8f1924442a4a2102e" exitCode=0 Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.987557 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl8g" event={"ID":"0f417ecf-9afd-4518-b54c-dbfedb17c67a","Type":"ContainerDied","Data":"df97d2e738ae538d7e690c9eaeacaf39c1143964b37c99b8f1924442a4a2102e"} Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.987606 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl8g" event={"ID":"0f417ecf-9afd-4518-b54c-dbfedb17c67a","Type":"ContainerStarted","Data":"ffe3eb7e3b09ca1cb331e841692f4c37ac25befa1f0fe104f9c371a3b796f1f1"} Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.991670 4786 generic.go:334] "Generic (PLEG): container finished" podID="5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5" containerID="242ef43ae06a3ab430d1c5288a3b2ca18e61d708d3c3f4adaf3816e145d3ac2b" exitCode=0 Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.991823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cwtl5" event={"ID":"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5","Type":"ContainerDied","Data":"242ef43ae06a3ab430d1c5288a3b2ca18e61d708d3c3f4adaf3816e145d3ac2b"} Mar 13 11:54:56 crc kubenswrapper[4786]: I0313 11:54:56.992137 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-cwtl5" event={"ID":"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5","Type":"ContainerStarted","Data":"8db2c00e92f293f025be97ee088bd439973ee34eea1abc27c5cb8a5f36cbd2c0"} Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.002068 4786 generic.go:334] "Generic (PLEG): container finished" podID="5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5" containerID="a4a63a01e005a609e7044209ef7ba5dcaf28db41289e7966ab15cd59001c770b" exitCode=0 Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.002162 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cwtl5" event={"ID":"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5","Type":"ContainerDied","Data":"a4a63a01e005a609e7044209ef7ba5dcaf28db41289e7966ab15cd59001c770b"} Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.006666 4786 generic.go:334] "Generic (PLEG): container finished" podID="0f417ecf-9afd-4518-b54c-dbfedb17c67a" containerID="89946892fb18c68dd3d6d71dad8a0cab2902bb468d330bfdd4b4fd8664dd8fa7" exitCode=0 Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.006694 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl8g" event={"ID":"0f417ecf-9afd-4518-b54c-dbfedb17c67a","Type":"ContainerDied","Data":"89946892fb18c68dd3d6d71dad8a0cab2902bb468d330bfdd4b4fd8664dd8fa7"} Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.178564 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dbxlz"] Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.181383 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.189052 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbxlz"] Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.196413 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.229175 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8284a0-87cc-4a4f-9498-f7103367855e-utilities\") pod \"community-operators-dbxlz\" (UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.229223 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8284a0-87cc-4a4f-9498-f7103367855e-catalog-content\") pod \"community-operators-dbxlz\" (UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.229268 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct9tp\" (UniqueName: \"kubernetes.io/projected/0a8284a0-87cc-4a4f-9498-f7103367855e-kube-api-access-ct9tp\") pod \"community-operators-dbxlz\" (UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.329666 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct9tp\" (UniqueName: \"kubernetes.io/projected/0a8284a0-87cc-4a4f-9498-f7103367855e-kube-api-access-ct9tp\") pod \"community-operators-dbxlz\" 
(UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.329740 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8284a0-87cc-4a4f-9498-f7103367855e-utilities\") pod \"community-operators-dbxlz\" (UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.329765 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8284a0-87cc-4a4f-9498-f7103367855e-catalog-content\") pod \"community-operators-dbxlz\" (UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.330226 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8284a0-87cc-4a4f-9498-f7103367855e-catalog-content\") pod \"community-operators-dbxlz\" (UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.330680 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8284a0-87cc-4a4f-9498-f7103367855e-utilities\") pod \"community-operators-dbxlz\" (UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.352840 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct9tp\" (UniqueName: \"kubernetes.io/projected/0a8284a0-87cc-4a4f-9498-f7103367855e-kube-api-access-ct9tp\") pod \"community-operators-dbxlz\" (UID: \"0a8284a0-87cc-4a4f-9498-f7103367855e\") " 
pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.386746 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vv77w"] Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.387910 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.392714 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.399224 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vv77w"] Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.430700 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsmd7\" (UniqueName: \"kubernetes.io/projected/ae33a694-0398-4129-9926-1b6dcb6ecc40-kube-api-access-lsmd7\") pod \"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.430750 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae33a694-0398-4129-9926-1b6dcb6ecc40-catalog-content\") pod \"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.430815 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae33a694-0398-4129-9926-1b6dcb6ecc40-utilities\") pod \"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" 
Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.522040 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.531710 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae33a694-0398-4129-9926-1b6dcb6ecc40-utilities\") pod \"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.531819 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsmd7\" (UniqueName: \"kubernetes.io/projected/ae33a694-0398-4129-9926-1b6dcb6ecc40-kube-api-access-lsmd7\") pod \"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.531860 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae33a694-0398-4129-9926-1b6dcb6ecc40-catalog-content\") pod \"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.532240 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae33a694-0398-4129-9926-1b6dcb6ecc40-utilities\") pod \"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.532278 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae33a694-0398-4129-9926-1b6dcb6ecc40-catalog-content\") pod 
\"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.560053 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsmd7\" (UniqueName: \"kubernetes.io/projected/ae33a694-0398-4129-9926-1b6dcb6ecc40-kube-api-access-lsmd7\") pod \"redhat-operators-vv77w\" (UID: \"ae33a694-0398-4129-9926-1b6dcb6ecc40\") " pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:54:58 crc kubenswrapper[4786]: I0313 11:54:58.718249 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:54:58.818540 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbxlz"] Mar 13 11:55:03 crc kubenswrapper[4786]: W0313 11:54:58.824569 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a8284a0_87cc_4a4f_9498_f7103367855e.slice/crio-9602e56ce7853e7e8a26d9197d9211cbff5b5e129ed7fb9e4f8edaa6d250853c WatchSource:0}: Error finding container 9602e56ce7853e7e8a26d9197d9211cbff5b5e129ed7fb9e4f8edaa6d250853c: Status 404 returned error can't find the container with id 9602e56ce7853e7e8a26d9197d9211cbff5b5e129ed7fb9e4f8edaa6d250853c Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:54:59.013252 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbxlz" event={"ID":"0a8284a0-87cc-4a4f-9498-f7103367855e","Type":"ContainerStarted","Data":"9602e56ce7853e7e8a26d9197d9211cbff5b5e129ed7fb9e4f8edaa6d250853c"} Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:00.020376 4786 generic.go:334] "Generic (PLEG): container finished" podID="0a8284a0-87cc-4a4f-9498-f7103367855e" containerID="9cef225e70514003271405fe55a33ef719e63fe4549b1afadcb3a4e29215af12" 
exitCode=0 Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:00.020533 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbxlz" event={"ID":"0a8284a0-87cc-4a4f-9498-f7103367855e","Type":"ContainerDied","Data":"9cef225e70514003271405fe55a33ef719e63fe4549b1afadcb3a4e29215af12"} Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:01.030537 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl8g" event={"ID":"0f417ecf-9afd-4518-b54c-dbfedb17c67a","Type":"ContainerStarted","Data":"3a205d796b32d41eb6a7c9bc533ea06f388ba6c80cff465f97b85d1b4b4670d0"} Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:01.061439 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jjl8g" podStartSLOduration=2.989038352 podStartE2EDuration="6.061411286s" podCreationTimestamp="2026-03-13 11:54:55 +0000 UTC" firstStartedPulling="2026-03-13 11:54:56.989064957 +0000 UTC m=+484.268718434" lastFinishedPulling="2026-03-13 11:55:00.061437881 +0000 UTC m=+487.341091368" observedRunningTime="2026-03-13 11:55:01.061269852 +0000 UTC m=+488.340923329" watchObservedRunningTime="2026-03-13 11:55:01.061411286 +0000 UTC m=+488.341064763" Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:02.036902 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cwtl5" event={"ID":"5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5","Type":"ContainerStarted","Data":"58744c87f09b891c912fad60196bb8c47046f58c35537ea64a902dc0cb328694"} Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:02.066564 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cwtl5" podStartSLOduration=2.875087985 podStartE2EDuration="7.066544394s" podCreationTimestamp="2026-03-13 11:54:55 +0000 UTC" firstStartedPulling="2026-03-13 11:54:56.994507119 +0000 UTC m=+484.274160606" 
lastFinishedPulling="2026-03-13 11:55:01.185963568 +0000 UTC m=+488.465617015" observedRunningTime="2026-03-13 11:55:02.063405717 +0000 UTC m=+489.343059164" watchObservedRunningTime="2026-03-13 11:55:02.066544394 +0000 UTC m=+489.346197841" Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:03.055120 4786 generic.go:334] "Generic (PLEG): container finished" podID="0a8284a0-87cc-4a4f-9498-f7103367855e" containerID="8e358e892c7b66e321fe95b006df1dce3050b7f54cdffd46eae5c3bfee3e9c06" exitCode=0 Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:03.055283 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbxlz" event={"ID":"0a8284a0-87cc-4a4f-9498-f7103367855e","Type":"ContainerDied","Data":"8e358e892c7b66e321fe95b006df1dce3050b7f54cdffd46eae5c3bfee3e9c06"} Mar 13 11:55:03 crc kubenswrapper[4786]: I0313 11:55:03.577872 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vv77w"] Mar 13 11:55:03 crc kubenswrapper[4786]: W0313 11:55:03.583635 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae33a694_0398_4129_9926_1b6dcb6ecc40.slice/crio-d5cece6735541fd9419eac6aa34bf1ac2b1ee916bebfa2e6e7d9917a97733be6 WatchSource:0}: Error finding container d5cece6735541fd9419eac6aa34bf1ac2b1ee916bebfa2e6e7d9917a97733be6: Status 404 returned error can't find the container with id d5cece6735541fd9419eac6aa34bf1ac2b1ee916bebfa2e6e7d9917a97733be6 Mar 13 11:55:04 crc kubenswrapper[4786]: I0313 11:55:04.064908 4786 generic.go:334] "Generic (PLEG): container finished" podID="ae33a694-0398-4129-9926-1b6dcb6ecc40" containerID="c2a500f00ee1650b402b9bb8daee72911005ec3b23ac5061698915527ebdd8a2" exitCode=0 Mar 13 11:55:04 crc kubenswrapper[4786]: I0313 11:55:04.064968 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv77w" 
event={"ID":"ae33a694-0398-4129-9926-1b6dcb6ecc40","Type":"ContainerDied","Data":"c2a500f00ee1650b402b9bb8daee72911005ec3b23ac5061698915527ebdd8a2"} Mar 13 11:55:04 crc kubenswrapper[4786]: I0313 11:55:04.065082 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv77w" event={"ID":"ae33a694-0398-4129-9926-1b6dcb6ecc40","Type":"ContainerStarted","Data":"d5cece6735541fd9419eac6aa34bf1ac2b1ee916bebfa2e6e7d9917a97733be6"} Mar 13 11:55:05 crc kubenswrapper[4786]: I0313 11:55:05.073148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbxlz" event={"ID":"0a8284a0-87cc-4a4f-9498-f7103367855e","Type":"ContainerStarted","Data":"fa702dc593b751fd99de86f9474df5c2aaa7554ff92e761e9d7976cdca882f5e"} Mar 13 11:55:05 crc kubenswrapper[4786]: I0313 11:55:05.102683 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dbxlz" podStartSLOduration=3.085119146 podStartE2EDuration="7.102659397s" podCreationTimestamp="2026-03-13 11:54:58 +0000 UTC" firstStartedPulling="2026-03-13 11:55:00.059275951 +0000 UTC m=+487.338929428" lastFinishedPulling="2026-03-13 11:55:04.076816222 +0000 UTC m=+491.356469679" observedRunningTime="2026-03-13 11:55:05.098133091 +0000 UTC m=+492.377786578" watchObservedRunningTime="2026-03-13 11:55:05.102659397 +0000 UTC m=+492.382312864" Mar 13 11:55:06 crc kubenswrapper[4786]: I0313 11:55:06.080874 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv77w" event={"ID":"ae33a694-0398-4129-9926-1b6dcb6ecc40","Type":"ContainerStarted","Data":"ca9484a5d78b921e2b55dca4be1e46e56947b54de562e3b2af567d377c153947"} Mar 13 11:55:06 crc kubenswrapper[4786]: I0313 11:55:06.113344 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:55:06 crc kubenswrapper[4786]: I0313 
11:55:06.113395 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:55:06 crc kubenswrapper[4786]: I0313 11:55:06.151481 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:55:06 crc kubenswrapper[4786]: I0313 11:55:06.306911 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:55:06 crc kubenswrapper[4786]: I0313 11:55:06.306973 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:55:06 crc kubenswrapper[4786]: I0313 11:55:06.355347 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:55:07 crc kubenswrapper[4786]: I0313 11:55:07.092256 4786 generic.go:334] "Generic (PLEG): container finished" podID="ae33a694-0398-4129-9926-1b6dcb6ecc40" containerID="ca9484a5d78b921e2b55dca4be1e46e56947b54de562e3b2af567d377c153947" exitCode=0 Mar 13 11:55:07 crc kubenswrapper[4786]: I0313 11:55:07.092353 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv77w" event={"ID":"ae33a694-0398-4129-9926-1b6dcb6ecc40","Type":"ContainerDied","Data":"ca9484a5d78b921e2b55dca4be1e46e56947b54de562e3b2af567d377c153947"} Mar 13 11:55:07 crc kubenswrapper[4786]: I0313 11:55:07.161910 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jjl8g" Mar 13 11:55:07 crc kubenswrapper[4786]: I0313 11:55:07.162965 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cwtl5" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.101356 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vv77w" event={"ID":"ae33a694-0398-4129-9926-1b6dcb6ecc40","Type":"ContainerStarted","Data":"ef578c70c225c928d08010bbe869d06de896a7a39c36ce144bbf76fd355ed4f8"} Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.131383 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vv77w" podStartSLOduration=6.694387266 podStartE2EDuration="10.131357283s" podCreationTimestamp="2026-03-13 11:54:58 +0000 UTC" firstStartedPulling="2026-03-13 11:55:04.075316279 +0000 UTC m=+491.354969736" lastFinishedPulling="2026-03-13 11:55:07.512286296 +0000 UTC m=+494.791939753" observedRunningTime="2026-03-13 11:55:08.126114257 +0000 UTC m=+495.405767734" watchObservedRunningTime="2026-03-13 11:55:08.131357283 +0000 UTC m=+495.411010770" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.169305 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.169396 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.169502 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.170579 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"916375dbd60d5646bdf04f7b3ff54e5cebfbc453558600eb317d36e2a6093d7a"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.170720 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://916375dbd60d5646bdf04f7b3ff54e5cebfbc453558600eb317d36e2a6093d7a" gracePeriod=600 Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.522762 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.523122 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.574486 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.718781 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:55:08 crc kubenswrapper[4786]: I0313 11:55:08.718825 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:55:09 crc kubenswrapper[4786]: I0313 11:55:09.107838 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="916375dbd60d5646bdf04f7b3ff54e5cebfbc453558600eb317d36e2a6093d7a" exitCode=0 Mar 13 11:55:09 crc kubenswrapper[4786]: I0313 11:55:09.107915 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"916375dbd60d5646bdf04f7b3ff54e5cebfbc453558600eb317d36e2a6093d7a"} Mar 13 11:55:09 crc kubenswrapper[4786]: I0313 11:55:09.108160 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"547b00ed88c3949b89b12da8316f4066352f35b5af45e0f69411a7f36910a357"} Mar 13 11:55:09 crc kubenswrapper[4786]: I0313 11:55:09.108180 4786 scope.go:117] "RemoveContainer" containerID="851b4bf4a13d73a7816d3daa50f20bbac074019ab9f5f82fb45568d253d450d3" Mar 13 11:55:09 crc kubenswrapper[4786]: I0313 11:55:09.151753 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dbxlz" Mar 13 11:55:09 crc kubenswrapper[4786]: I0313 11:55:09.739056 4786 scope.go:117] "RemoveContainer" containerID="faac95883f46a9a9037a00847e6dbb43a5bd987b6aa9c744f35004fa2e724efa" Mar 13 11:55:09 crc kubenswrapper[4786]: I0313 11:55:09.778840 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vv77w" podUID="ae33a694-0398-4129-9926-1b6dcb6ecc40" containerName="registry-server" probeResult="failure" output=< Mar 13 11:55:09 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 11:55:09 crc kubenswrapper[4786]: > Mar 13 11:55:18 crc kubenswrapper[4786]: I0313 11:55:18.775910 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:55:18 crc kubenswrapper[4786]: I0313 11:55:18.846372 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vv77w" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.140641 4786 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29556716-qc8sr"] Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.143710 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-qc8sr" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.146372 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.146446 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.151854 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.153054 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-qc8sr"] Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.319034 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqxgc\" (UniqueName: \"kubernetes.io/projected/cbd421e3-789b-475c-9a4d-b20bc95f15bf-kube-api-access-nqxgc\") pod \"auto-csr-approver-29556716-qc8sr\" (UID: \"cbd421e3-789b-475c-9a4d-b20bc95f15bf\") " pod="openshift-infra/auto-csr-approver-29556716-qc8sr" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.421211 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqxgc\" (UniqueName: \"kubernetes.io/projected/cbd421e3-789b-475c-9a4d-b20bc95f15bf-kube-api-access-nqxgc\") pod \"auto-csr-approver-29556716-qc8sr\" (UID: \"cbd421e3-789b-475c-9a4d-b20bc95f15bf\") " pod="openshift-infra/auto-csr-approver-29556716-qc8sr" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.448413 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqxgc\" (UniqueName: 
\"kubernetes.io/projected/cbd421e3-789b-475c-9a4d-b20bc95f15bf-kube-api-access-nqxgc\") pod \"auto-csr-approver-29556716-qc8sr\" (UID: \"cbd421e3-789b-475c-9a4d-b20bc95f15bf\") " pod="openshift-infra/auto-csr-approver-29556716-qc8sr" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.471440 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-qc8sr" Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.673119 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-qc8sr"] Mar 13 11:56:00 crc kubenswrapper[4786]: I0313 11:56:00.686287 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 11:56:01 crc kubenswrapper[4786]: I0313 11:56:01.472498 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-qc8sr" event={"ID":"cbd421e3-789b-475c-9a4d-b20bc95f15bf","Type":"ContainerStarted","Data":"99c0f6bd4bd16fb787639cd6834f1ccb0abed15f89147f08cf2ace50d7c7a86b"} Mar 13 11:56:02 crc kubenswrapper[4786]: I0313 11:56:02.481994 4786 generic.go:334] "Generic (PLEG): container finished" podID="cbd421e3-789b-475c-9a4d-b20bc95f15bf" containerID="2f3e95662904899e924a6719689e2bed3be873fcc70e0d2706572a2224ae5d93" exitCode=0 Mar 13 11:56:02 crc kubenswrapper[4786]: I0313 11:56:02.482067 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-qc8sr" event={"ID":"cbd421e3-789b-475c-9a4d-b20bc95f15bf","Type":"ContainerDied","Data":"2f3e95662904899e924a6719689e2bed3be873fcc70e0d2706572a2224ae5d93"} Mar 13 11:56:03 crc kubenswrapper[4786]: I0313 11:56:03.723679 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-qc8sr" Mar 13 11:56:03 crc kubenswrapper[4786]: I0313 11:56:03.870631 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqxgc\" (UniqueName: \"kubernetes.io/projected/cbd421e3-789b-475c-9a4d-b20bc95f15bf-kube-api-access-nqxgc\") pod \"cbd421e3-789b-475c-9a4d-b20bc95f15bf\" (UID: \"cbd421e3-789b-475c-9a4d-b20bc95f15bf\") " Mar 13 11:56:03 crc kubenswrapper[4786]: I0313 11:56:03.880131 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd421e3-789b-475c-9a4d-b20bc95f15bf-kube-api-access-nqxgc" (OuterVolumeSpecName: "kube-api-access-nqxgc") pod "cbd421e3-789b-475c-9a4d-b20bc95f15bf" (UID: "cbd421e3-789b-475c-9a4d-b20bc95f15bf"). InnerVolumeSpecName "kube-api-access-nqxgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:56:03 crc kubenswrapper[4786]: I0313 11:56:03.972659 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqxgc\" (UniqueName: \"kubernetes.io/projected/cbd421e3-789b-475c-9a4d-b20bc95f15bf-kube-api-access-nqxgc\") on node \"crc\" DevicePath \"\"" Mar 13 11:56:04 crc kubenswrapper[4786]: I0313 11:56:04.499174 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-qc8sr" event={"ID":"cbd421e3-789b-475c-9a4d-b20bc95f15bf","Type":"ContainerDied","Data":"99c0f6bd4bd16fb787639cd6834f1ccb0abed15f89147f08cf2ace50d7c7a86b"} Mar 13 11:56:04 crc kubenswrapper[4786]: I0313 11:56:04.499250 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99c0f6bd4bd16fb787639cd6834f1ccb0abed15f89147f08cf2ace50d7c7a86b" Mar 13 11:56:04 crc kubenswrapper[4786]: I0313 11:56:04.499313 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-qc8sr" Mar 13 11:56:04 crc kubenswrapper[4786]: I0313 11:56:04.790043 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-bmgbc"] Mar 13 11:56:04 crc kubenswrapper[4786]: I0313 11:56:04.796080 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-bmgbc"] Mar 13 11:56:05 crc kubenswrapper[4786]: I0313 11:56:05.454848 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e610d5-a3b8-4fc8-a472-01ab5bb625d5" path="/var/lib/kubelet/pods/01e610d5-a3b8-4fc8-a472-01ab5bb625d5/volumes" Mar 13 11:57:08 crc kubenswrapper[4786]: I0313 11:57:08.169120 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:57:08 crc kubenswrapper[4786]: I0313 11:57:08.169666 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:57:38 crc kubenswrapper[4786]: I0313 11:57:38.170213 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:57:38 crc kubenswrapper[4786]: I0313 11:57:38.171200 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" 
podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.153280 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556718-jvntt"] Mar 13 11:58:00 crc kubenswrapper[4786]: E0313 11:58:00.155644 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd421e3-789b-475c-9a4d-b20bc95f15bf" containerName="oc" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.155826 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd421e3-789b-475c-9a4d-b20bc95f15bf" containerName="oc" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.156184 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd421e3-789b-475c-9a4d-b20bc95f15bf" containerName="oc" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.157118 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-jvntt" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.160128 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.160559 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.161106 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.176666 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-jvntt"] Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.247853 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtq75\" (UniqueName: \"kubernetes.io/projected/e7390aaf-5c5b-4eb5-8a09-e40a5591da5b-kube-api-access-qtq75\") pod \"auto-csr-approver-29556718-jvntt\" (UID: \"e7390aaf-5c5b-4eb5-8a09-e40a5591da5b\") " pod="openshift-infra/auto-csr-approver-29556718-jvntt" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.349023 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtq75\" (UniqueName: \"kubernetes.io/projected/e7390aaf-5c5b-4eb5-8a09-e40a5591da5b-kube-api-access-qtq75\") pod \"auto-csr-approver-29556718-jvntt\" (UID: \"e7390aaf-5c5b-4eb5-8a09-e40a5591da5b\") " pod="openshift-infra/auto-csr-approver-29556718-jvntt" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.387105 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtq75\" (UniqueName: \"kubernetes.io/projected/e7390aaf-5c5b-4eb5-8a09-e40a5591da5b-kube-api-access-qtq75\") pod \"auto-csr-approver-29556718-jvntt\" (UID: \"e7390aaf-5c5b-4eb5-8a09-e40a5591da5b\") " 
pod="openshift-infra/auto-csr-approver-29556718-jvntt" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.492679 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-jvntt" Mar 13 11:58:00 crc kubenswrapper[4786]: I0313 11:58:00.811563 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-jvntt"] Mar 13 11:58:00 crc kubenswrapper[4786]: W0313 11:58:00.816933 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7390aaf_5c5b_4eb5_8a09_e40a5591da5b.slice/crio-459cbf805bf02806aa801b875a49256c165fffd0b267de81b5032148177370c1 WatchSource:0}: Error finding container 459cbf805bf02806aa801b875a49256c165fffd0b267de81b5032148177370c1: Status 404 returned error can't find the container with id 459cbf805bf02806aa801b875a49256c165fffd0b267de81b5032148177370c1 Mar 13 11:58:01 crc kubenswrapper[4786]: I0313 11:58:01.291673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-jvntt" event={"ID":"e7390aaf-5c5b-4eb5-8a09-e40a5591da5b","Type":"ContainerStarted","Data":"459cbf805bf02806aa801b875a49256c165fffd0b267de81b5032148177370c1"} Mar 13 11:58:02 crc kubenswrapper[4786]: I0313 11:58:02.299176 4786 generic.go:334] "Generic (PLEG): container finished" podID="e7390aaf-5c5b-4eb5-8a09-e40a5591da5b" containerID="fb1823e793d40ba291d6da1843368104644d032300234a9ee48d8c92d161d6c5" exitCode=0 Mar 13 11:58:02 crc kubenswrapper[4786]: I0313 11:58:02.299233 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-jvntt" event={"ID":"e7390aaf-5c5b-4eb5-8a09-e40a5591da5b","Type":"ContainerDied","Data":"fb1823e793d40ba291d6da1843368104644d032300234a9ee48d8c92d161d6c5"} Mar 13 11:58:03 crc kubenswrapper[4786]: I0313 11:58:03.658395 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-jvntt" Mar 13 11:58:03 crc kubenswrapper[4786]: I0313 11:58:03.742821 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtq75\" (UniqueName: \"kubernetes.io/projected/e7390aaf-5c5b-4eb5-8a09-e40a5591da5b-kube-api-access-qtq75\") pod \"e7390aaf-5c5b-4eb5-8a09-e40a5591da5b\" (UID: \"e7390aaf-5c5b-4eb5-8a09-e40a5591da5b\") " Mar 13 11:58:03 crc kubenswrapper[4786]: I0313 11:58:03.751623 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7390aaf-5c5b-4eb5-8a09-e40a5591da5b-kube-api-access-qtq75" (OuterVolumeSpecName: "kube-api-access-qtq75") pod "e7390aaf-5c5b-4eb5-8a09-e40a5591da5b" (UID: "e7390aaf-5c5b-4eb5-8a09-e40a5591da5b"). InnerVolumeSpecName "kube-api-access-qtq75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:58:03 crc kubenswrapper[4786]: I0313 11:58:03.844270 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtq75\" (UniqueName: \"kubernetes.io/projected/e7390aaf-5c5b-4eb5-8a09-e40a5591da5b-kube-api-access-qtq75\") on node \"crc\" DevicePath \"\"" Mar 13 11:58:04 crc kubenswrapper[4786]: I0313 11:58:04.316420 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-jvntt" event={"ID":"e7390aaf-5c5b-4eb5-8a09-e40a5591da5b","Type":"ContainerDied","Data":"459cbf805bf02806aa801b875a49256c165fffd0b267de81b5032148177370c1"} Mar 13 11:58:04 crc kubenswrapper[4786]: I0313 11:58:04.316477 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="459cbf805bf02806aa801b875a49256c165fffd0b267de81b5032148177370c1" Mar 13 11:58:04 crc kubenswrapper[4786]: I0313 11:58:04.316484 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-jvntt" Mar 13 11:58:04 crc kubenswrapper[4786]: I0313 11:58:04.729393 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-wsvjm"] Mar 13 11:58:04 crc kubenswrapper[4786]: I0313 11:58:04.738945 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-wsvjm"] Mar 13 11:58:05 crc kubenswrapper[4786]: I0313 11:58:05.451480 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fccd37e-4dd9-43b1-9553-507adcc48841" path="/var/lib/kubelet/pods/0fccd37e-4dd9-43b1-9553-507adcc48841/volumes" Mar 13 11:58:08 crc kubenswrapper[4786]: I0313 11:58:08.169216 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:58:08 crc kubenswrapper[4786]: I0313 11:58:08.169555 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:58:08 crc kubenswrapper[4786]: I0313 11:58:08.169615 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 11:58:08 crc kubenswrapper[4786]: I0313 11:58:08.170315 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"547b00ed88c3949b89b12da8316f4066352f35b5af45e0f69411a7f36910a357"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 11:58:08 crc kubenswrapper[4786]: I0313 11:58:08.170412 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://547b00ed88c3949b89b12da8316f4066352f35b5af45e0f69411a7f36910a357" gracePeriod=600 Mar 13 11:58:09 crc kubenswrapper[4786]: I0313 11:58:09.359857 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="547b00ed88c3949b89b12da8316f4066352f35b5af45e0f69411a7f36910a357" exitCode=0 Mar 13 11:58:09 crc kubenswrapper[4786]: I0313 11:58:09.359935 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"547b00ed88c3949b89b12da8316f4066352f35b5af45e0f69411a7f36910a357"} Mar 13 11:58:09 crc kubenswrapper[4786]: I0313 11:58:09.360328 4786 scope.go:117] "RemoveContainer" containerID="916375dbd60d5646bdf04f7b3ff54e5cebfbc453558600eb317d36e2a6093d7a" Mar 13 11:58:09 crc kubenswrapper[4786]: I0313 11:58:09.909211 4786 scope.go:117] "RemoveContainer" containerID="d624051a2ace35fa59b7c1dc6cf92f8ab0bd4c05854cfa4d29a57a0b17f95820" Mar 13 11:58:09 crc kubenswrapper[4786]: I0313 11:58:09.966213 4786 scope.go:117] "RemoveContainer" containerID="4adf9ddcb68c2f5c1789af829fd8b03591fc8522c100208f8726e9eb6c3d81b0" Mar 13 11:58:10 crc kubenswrapper[4786]: I0313 11:58:10.372252 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"53f9a9165f399ca75a5c5e665434b0714c4c497324b97b5da97227bbf25aa5b5"} Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 
12:00:00.149318 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556720-zx752"] Mar 13 12:00:00 crc kubenswrapper[4786]: E0313 12:00:00.150102 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7390aaf-5c5b-4eb5-8a09-e40a5591da5b" containerName="oc" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.150114 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7390aaf-5c5b-4eb5-8a09-e40a5591da5b" containerName="oc" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.150213 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7390aaf-5c5b-4eb5-8a09-e40a5591da5b" containerName="oc" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.150581 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-zx752" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.153789 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.154347 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.154844 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.156706 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-zx752"] Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.240396 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l"] Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.241183 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.243278 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.243539 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.247847 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l"] Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.252244 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw7gp\" (UniqueName: \"kubernetes.io/projected/9ce0f979-89dc-435e-abf3-0a4a4102c338-kube-api-access-vw7gp\") pod \"auto-csr-approver-29556720-zx752\" (UID: \"9ce0f979-89dc-435e-abf3-0a4a4102c338\") " pod="openshift-infra/auto-csr-approver-29556720-zx752" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.352955 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw7gp\" (UniqueName: \"kubernetes.io/projected/9ce0f979-89dc-435e-abf3-0a4a4102c338-kube-api-access-vw7gp\") pod \"auto-csr-approver-29556720-zx752\" (UID: \"9ce0f979-89dc-435e-abf3-0a4a4102c338\") " pod="openshift-infra/auto-csr-approver-29556720-zx752" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.353010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-secret-volume\") pod \"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 
12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.353042 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt49w\" (UniqueName: \"kubernetes.io/projected/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-kube-api-access-nt49w\") pod \"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.353093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-config-volume\") pod \"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.371317 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw7gp\" (UniqueName: \"kubernetes.io/projected/9ce0f979-89dc-435e-abf3-0a4a4102c338-kube-api-access-vw7gp\") pod \"auto-csr-approver-29556720-zx752\" (UID: \"9ce0f979-89dc-435e-abf3-0a4a4102c338\") " pod="openshift-infra/auto-csr-approver-29556720-zx752" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.454616 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-config-volume\") pod \"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.454691 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-secret-volume\") pod 
\"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.454730 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt49w\" (UniqueName: \"kubernetes.io/projected/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-kube-api-access-nt49w\") pod \"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.455844 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-config-volume\") pod \"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.459914 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-secret-volume\") pod \"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.470730 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt49w\" (UniqueName: \"kubernetes.io/projected/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-kube-api-access-nt49w\") pod \"collect-profiles-29556720-hsr7l\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.475299 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-zx752" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.560326 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.685258 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-zx752"] Mar 13 12:00:00 crc kubenswrapper[4786]: I0313 12:00:00.782765 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l"] Mar 13 12:00:00 crc kubenswrapper[4786]: W0313 12:00:00.789312 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37f2b1db_8b0a_414d_a5f1_bfc6f39ef7a5.slice/crio-16b5a369fbfdefc652393a873a5050e2cb26a8fe8a3b4ed001d017cfa428bc12 WatchSource:0}: Error finding container 16b5a369fbfdefc652393a873a5050e2cb26a8fe8a3b4ed001d017cfa428bc12: Status 404 returned error can't find the container with id 16b5a369fbfdefc652393a873a5050e2cb26a8fe8a3b4ed001d017cfa428bc12 Mar 13 12:00:01 crc kubenswrapper[4786]: I0313 12:00:01.175460 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" event={"ID":"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5","Type":"ContainerStarted","Data":"3270d775451c2ba76a2d69b1754926133289fe70578f3f6a85a8844e38dc2860"} Mar 13 12:00:01 crc kubenswrapper[4786]: I0313 12:00:01.175502 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" event={"ID":"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5","Type":"ContainerStarted","Data":"16b5a369fbfdefc652393a873a5050e2cb26a8fe8a3b4ed001d017cfa428bc12"} Mar 13 12:00:01 crc kubenswrapper[4786]: I0313 12:00:01.176525 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-infra/auto-csr-approver-29556720-zx752" event={"ID":"9ce0f979-89dc-435e-abf3-0a4a4102c338","Type":"ContainerStarted","Data":"5170f456466c4399ed0f3111e074ee40badb727b19c37beea9590dbd088313e0"} Mar 13 12:00:01 crc kubenswrapper[4786]: I0313 12:00:01.193649 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" podStartSLOduration=1.193624738 podStartE2EDuration="1.193624738s" podCreationTimestamp="2026-03-13 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:00:01.190356546 +0000 UTC m=+788.470010023" watchObservedRunningTime="2026-03-13 12:00:01.193624738 +0000 UTC m=+788.473278205" Mar 13 12:00:02 crc kubenswrapper[4786]: I0313 12:00:02.184373 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-zx752" event={"ID":"9ce0f979-89dc-435e-abf3-0a4a4102c338","Type":"ContainerStarted","Data":"800125b044ed448a929edb34f5731e31d2715a0df4bf5ac7f81b181924e5eabe"} Mar 13 12:00:02 crc kubenswrapper[4786]: I0313 12:00:02.186202 4786 generic.go:334] "Generic (PLEG): container finished" podID="37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5" containerID="3270d775451c2ba76a2d69b1754926133289fe70578f3f6a85a8844e38dc2860" exitCode=0 Mar 13 12:00:02 crc kubenswrapper[4786]: I0313 12:00:02.186328 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" event={"ID":"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5","Type":"ContainerDied","Data":"3270d775451c2ba76a2d69b1754926133289fe70578f3f6a85a8844e38dc2860"} Mar 13 12:00:02 crc kubenswrapper[4786]: I0313 12:00:02.197859 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556720-zx752" podStartSLOduration=1.079163064 podStartE2EDuration="2.197840504s" 
podCreationTimestamp="2026-03-13 12:00:00 +0000 UTC" firstStartedPulling="2026-03-13 12:00:00.698752538 +0000 UTC m=+787.978405985" lastFinishedPulling="2026-03-13 12:00:01.817429938 +0000 UTC m=+789.097083425" observedRunningTime="2026-03-13 12:00:02.196870046 +0000 UTC m=+789.476523593" watchObservedRunningTime="2026-03-13 12:00:02.197840504 +0000 UTC m=+789.477493951" Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.194633 4786 generic.go:334] "Generic (PLEG): container finished" podID="9ce0f979-89dc-435e-abf3-0a4a4102c338" containerID="800125b044ed448a929edb34f5731e31d2715a0df4bf5ac7f81b181924e5eabe" exitCode=0 Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.194746 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-zx752" event={"ID":"9ce0f979-89dc-435e-abf3-0a4a4102c338","Type":"ContainerDied","Data":"800125b044ed448a929edb34f5731e31d2715a0df4bf5ac7f81b181924e5eabe"} Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.480526 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.606173 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-config-volume\") pod \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.606272 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt49w\" (UniqueName: \"kubernetes.io/projected/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-kube-api-access-nt49w\") pod \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.606404 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-secret-volume\") pod \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\" (UID: \"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5\") " Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.607098 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5" (UID: "37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.611428 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5" (UID: "37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.611423 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-kube-api-access-nt49w" (OuterVolumeSpecName: "kube-api-access-nt49w") pod "37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5" (UID: "37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5"). InnerVolumeSpecName "kube-api-access-nt49w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.707746 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.707795 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt49w\" (UniqueName: \"kubernetes.io/projected/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-kube-api-access-nt49w\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:03 crc kubenswrapper[4786]: I0313 12:00:03.707815 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:04 crc kubenswrapper[4786]: I0313 12:00:04.205245 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" Mar 13 12:00:04 crc kubenswrapper[4786]: I0313 12:00:04.205262 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l" event={"ID":"37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5","Type":"ContainerDied","Data":"16b5a369fbfdefc652393a873a5050e2cb26a8fe8a3b4ed001d017cfa428bc12"} Mar 13 12:00:04 crc kubenswrapper[4786]: I0313 12:00:04.205406 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16b5a369fbfdefc652393a873a5050e2cb26a8fe8a3b4ed001d017cfa428bc12" Mar 13 12:00:04 crc kubenswrapper[4786]: I0313 12:00:04.393372 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-zx752" Mar 13 12:00:04 crc kubenswrapper[4786]: I0313 12:00:04.418452 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw7gp\" (UniqueName: \"kubernetes.io/projected/9ce0f979-89dc-435e-abf3-0a4a4102c338-kube-api-access-vw7gp\") pod \"9ce0f979-89dc-435e-abf3-0a4a4102c338\" (UID: \"9ce0f979-89dc-435e-abf3-0a4a4102c338\") " Mar 13 12:00:04 crc kubenswrapper[4786]: I0313 12:00:04.423181 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce0f979-89dc-435e-abf3-0a4a4102c338-kube-api-access-vw7gp" (OuterVolumeSpecName: "kube-api-access-vw7gp") pod "9ce0f979-89dc-435e-abf3-0a4a4102c338" (UID: "9ce0f979-89dc-435e-abf3-0a4a4102c338"). InnerVolumeSpecName "kube-api-access-vw7gp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:00:04 crc kubenswrapper[4786]: I0313 12:00:04.520917 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw7gp\" (UniqueName: \"kubernetes.io/projected/9ce0f979-89dc-435e-abf3-0a4a4102c338-kube-api-access-vw7gp\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:05 crc kubenswrapper[4786]: I0313 12:00:05.216395 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-zx752" event={"ID":"9ce0f979-89dc-435e-abf3-0a4a4102c338","Type":"ContainerDied","Data":"5170f456466c4399ed0f3111e074ee40badb727b19c37beea9590dbd088313e0"} Mar 13 12:00:05 crc kubenswrapper[4786]: I0313 12:00:05.216451 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5170f456466c4399ed0f3111e074ee40badb727b19c37beea9590dbd088313e0" Mar 13 12:00:05 crc kubenswrapper[4786]: I0313 12:00:05.216506 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-zx752" Mar 13 12:00:05 crc kubenswrapper[4786]: I0313 12:00:05.252613 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-pfqgz"] Mar 13 12:00:05 crc kubenswrapper[4786]: I0313 12:00:05.258541 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-pfqgz"] Mar 13 12:00:05 crc kubenswrapper[4786]: I0313 12:00:05.449155 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c4fad7-502e-4920-8e7e-5e096b9f6653" path="/var/lib/kubelet/pods/a1c4fad7-502e-4920-8e7e-5e096b9f6653/volumes" Mar 13 12:00:10 crc kubenswrapper[4786]: I0313 12:00:10.061676 4786 scope.go:117] "RemoveContainer" containerID="6fc01c2b9c755543979a57ea696ee8afaa8ebb3f236447878ed5b226656059ec" Mar 13 12:00:38 crc kubenswrapper[4786]: I0313 12:00:38.169515 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:00:38 crc kubenswrapper[4786]: I0313 12:00:38.170201 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:00:49 crc kubenswrapper[4786]: I0313 12:00:49.263297 4786 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.169175 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.172121 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.220274 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zsf4r"] Mar 13 12:01:08 crc kubenswrapper[4786]: E0313 12:01:08.220642 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce0f979-89dc-435e-abf3-0a4a4102c338" containerName="oc" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.220671 4786 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce0f979-89dc-435e-abf3-0a4a4102c338" containerName="oc" Mar 13 12:01:08 crc kubenswrapper[4786]: E0313 12:01:08.220715 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5" containerName="collect-profiles" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.220728 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5" containerName="collect-profiles" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.220940 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce0f979-89dc-435e-abf3-0a4a4102c338" containerName="oc" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.220972 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5" containerName="collect-profiles" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.222271 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.233721 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zsf4r"] Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.235320 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-utilities\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.235409 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48d9\" (UniqueName: \"kubernetes.io/projected/95fda17f-5c22-4321-b98f-c9d17221c3ee-kube-api-access-d48d9\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.235504 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-catalog-content\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.337277 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-utilities\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.337529 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d48d9\" (UniqueName: \"kubernetes.io/projected/95fda17f-5c22-4321-b98f-c9d17221c3ee-kube-api-access-d48d9\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.337559 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-catalog-content\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.338040 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-catalog-content\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.338130 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-utilities\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.360307 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48d9\" (UniqueName: \"kubernetes.io/projected/95fda17f-5c22-4321-b98f-c9d17221c3ee-kube-api-access-d48d9\") pod \"community-operators-zsf4r\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.585112 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:08 crc kubenswrapper[4786]: I0313 12:01:08.830226 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zsf4r"] Mar 13 12:01:09 crc kubenswrapper[4786]: I0313 12:01:09.652555 4786 generic.go:334] "Generic (PLEG): container finished" podID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerID="2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a" exitCode=0 Mar 13 12:01:09 crc kubenswrapper[4786]: I0313 12:01:09.652815 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsf4r" event={"ID":"95fda17f-5c22-4321-b98f-c9d17221c3ee","Type":"ContainerDied","Data":"2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a"} Mar 13 12:01:09 crc kubenswrapper[4786]: I0313 12:01:09.652877 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsf4r" event={"ID":"95fda17f-5c22-4321-b98f-c9d17221c3ee","Type":"ContainerStarted","Data":"316187aef8e6929fd1870c0fd12b8305605126696701c69a3a431043a3bcc25b"} Mar 13 12:01:09 crc kubenswrapper[4786]: I0313 12:01:09.655375 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:01:11 crc kubenswrapper[4786]: I0313 12:01:11.669156 4786 generic.go:334] "Generic (PLEG): container finished" podID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerID="939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902" exitCode=0 Mar 13 12:01:11 crc kubenswrapper[4786]: I0313 12:01:11.669376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsf4r" event={"ID":"95fda17f-5c22-4321-b98f-c9d17221c3ee","Type":"ContainerDied","Data":"939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902"} Mar 13 12:01:12 crc kubenswrapper[4786]: I0313 12:01:12.681370 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-zsf4r" event={"ID":"95fda17f-5c22-4321-b98f-c9d17221c3ee","Type":"ContainerStarted","Data":"76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06"} Mar 13 12:01:12 crc kubenswrapper[4786]: I0313 12:01:12.711338 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zsf4r" podStartSLOduration=2.135148371 podStartE2EDuration="4.711309659s" podCreationTimestamp="2026-03-13 12:01:08 +0000 UTC" firstStartedPulling="2026-03-13 12:01:09.654948569 +0000 UTC m=+856.934602056" lastFinishedPulling="2026-03-13 12:01:12.231109857 +0000 UTC m=+859.510763344" observedRunningTime="2026-03-13 12:01:12.709682898 +0000 UTC m=+859.989336395" watchObservedRunningTime="2026-03-13 12:01:12.711309659 +0000 UTC m=+859.990963176" Mar 13 12:01:18 crc kubenswrapper[4786]: I0313 12:01:18.585572 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:18 crc kubenswrapper[4786]: I0313 12:01:18.586410 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:18 crc kubenswrapper[4786]: I0313 12:01:18.647049 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:18 crc kubenswrapper[4786]: I0313 12:01:18.775135 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:18 crc kubenswrapper[4786]: I0313 12:01:18.888961 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zsf4r"] Mar 13 12:01:20 crc kubenswrapper[4786]: I0313 12:01:20.738199 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zsf4r" 
podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerName="registry-server" containerID="cri-o://76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06" gracePeriod=2 Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.149878 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.318825 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-utilities\") pod \"95fda17f-5c22-4321-b98f-c9d17221c3ee\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.318911 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d48d9\" (UniqueName: \"kubernetes.io/projected/95fda17f-5c22-4321-b98f-c9d17221c3ee-kube-api-access-d48d9\") pod \"95fda17f-5c22-4321-b98f-c9d17221c3ee\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.319002 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-catalog-content\") pod \"95fda17f-5c22-4321-b98f-c9d17221c3ee\" (UID: \"95fda17f-5c22-4321-b98f-c9d17221c3ee\") " Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.320444 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-utilities" (OuterVolumeSpecName: "utilities") pod "95fda17f-5c22-4321-b98f-c9d17221c3ee" (UID: "95fda17f-5c22-4321-b98f-c9d17221c3ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.322646 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.328368 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fda17f-5c22-4321-b98f-c9d17221c3ee-kube-api-access-d48d9" (OuterVolumeSpecName: "kube-api-access-d48d9") pod "95fda17f-5c22-4321-b98f-c9d17221c3ee" (UID: "95fda17f-5c22-4321-b98f-c9d17221c3ee"). InnerVolumeSpecName "kube-api-access-d48d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.423405 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d48d9\" (UniqueName: \"kubernetes.io/projected/95fda17f-5c22-4321-b98f-c9d17221c3ee-kube-api-access-d48d9\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.746345 4786 generic.go:334] "Generic (PLEG): container finished" podID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerID="76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06" exitCode=0 Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.746391 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsf4r" event={"ID":"95fda17f-5c22-4321-b98f-c9d17221c3ee","Type":"ContainerDied","Data":"76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06"} Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.746420 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsf4r" event={"ID":"95fda17f-5c22-4321-b98f-c9d17221c3ee","Type":"ContainerDied","Data":"316187aef8e6929fd1870c0fd12b8305605126696701c69a3a431043a3bcc25b"} Mar 13 12:01:21 crc kubenswrapper[4786]: 
I0313 12:01:21.746440 4786 scope.go:117] "RemoveContainer" containerID="76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.746569 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zsf4r" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.768135 4786 scope.go:117] "RemoveContainer" containerID="939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.795616 4786 scope.go:117] "RemoveContainer" containerID="2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.828956 4786 scope.go:117] "RemoveContainer" containerID="76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06" Mar 13 12:01:21 crc kubenswrapper[4786]: E0313 12:01:21.829433 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06\": container with ID starting with 76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06 not found: ID does not exist" containerID="76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.829482 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06"} err="failed to get container status \"76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06\": rpc error: code = NotFound desc = could not find container \"76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06\": container with ID starting with 76f5157d25e434b2a722d5dcb63ac279d2a8b330fbe7076fc1435140401dac06 not found: ID does not exist" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.829511 4786 
scope.go:117] "RemoveContainer" containerID="939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902" Mar 13 12:01:21 crc kubenswrapper[4786]: E0313 12:01:21.830524 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902\": container with ID starting with 939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902 not found: ID does not exist" containerID="939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.830563 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902"} err="failed to get container status \"939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902\": rpc error: code = NotFound desc = could not find container \"939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902\": container with ID starting with 939115a75081cb61f52876610088138e752945479fdcb67457ec638592c7b902 not found: ID does not exist" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.830589 4786 scope.go:117] "RemoveContainer" containerID="2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a" Mar 13 12:01:21 crc kubenswrapper[4786]: E0313 12:01:21.830979 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a\": container with ID starting with 2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a not found: ID does not exist" containerID="2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a" Mar 13 12:01:21 crc kubenswrapper[4786]: I0313 12:01:21.831037 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a"} err="failed to get container status \"2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a\": rpc error: code = NotFound desc = could not find container \"2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a\": container with ID starting with 2da29a5611778e2d5aebb0dabe929e6cbceb9d5f65003ad87dfa86a48596727a not found: ID does not exist" Mar 13 12:01:22 crc kubenswrapper[4786]: I0313 12:01:22.377610 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95fda17f-5c22-4321-b98f-c9d17221c3ee" (UID: "95fda17f-5c22-4321-b98f-c9d17221c3ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:01:22 crc kubenswrapper[4786]: I0313 12:01:22.436170 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95fda17f-5c22-4321-b98f-c9d17221c3ee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:22 crc kubenswrapper[4786]: I0313 12:01:22.677291 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zsf4r"] Mar 13 12:01:22 crc kubenswrapper[4786]: I0313 12:01:22.682494 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zsf4r"] Mar 13 12:01:23 crc kubenswrapper[4786]: I0313 12:01:23.452303 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" path="/var/lib/kubelet/pods/95fda17f-5c22-4321-b98f-c9d17221c3ee/volumes" Mar 13 12:01:28 crc kubenswrapper[4786]: I0313 12:01:28.894637 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m7mcx"] Mar 13 12:01:28 crc kubenswrapper[4786]: E0313 
12:01:28.895516 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerName="extract-content"
Mar 13 12:01:28 crc kubenswrapper[4786]: I0313 12:01:28.895544 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerName="extract-content"
Mar 13 12:01:28 crc kubenswrapper[4786]: E0313 12:01:28.895585 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerName="registry-server"
Mar 13 12:01:28 crc kubenswrapper[4786]: I0313 12:01:28.895601 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerName="registry-server"
Mar 13 12:01:28 crc kubenswrapper[4786]: E0313 12:01:28.895636 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerName="extract-utilities"
Mar 13 12:01:28 crc kubenswrapper[4786]: I0313 12:01:28.895653 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerName="extract-utilities"
Mar 13 12:01:28 crc kubenswrapper[4786]: I0313 12:01:28.895863 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fda17f-5c22-4321-b98f-c9d17221c3ee" containerName="registry-server"
Mar 13 12:01:28 crc kubenswrapper[4786]: I0313 12:01:28.900505 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:28 crc kubenswrapper[4786]: I0313 12:01:28.924730 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m7mcx"]
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.035464 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-catalog-content\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.035559 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98dvf\" (UniqueName: \"kubernetes.io/projected/219b765a-be88-4b82-92b3-9c6d6e64924f-kube-api-access-98dvf\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.035717 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-utilities\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.137650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98dvf\" (UniqueName: \"kubernetes.io/projected/219b765a-be88-4b82-92b3-9c6d6e64924f-kube-api-access-98dvf\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.137734 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-catalog-content\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.137829 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-utilities\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.138418 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-catalog-content\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.138547 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-utilities\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.167292 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98dvf\" (UniqueName: \"kubernetes.io/projected/219b765a-be88-4b82-92b3-9c6d6e64924f-kube-api-access-98dvf\") pod \"redhat-operators-m7mcx\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.226666 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7mcx"
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.475199 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m7mcx"]
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.804869 4786 generic.go:334] "Generic (PLEG): container finished" podID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerID="0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883" exitCode=0
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.804953 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7mcx" event={"ID":"219b765a-be88-4b82-92b3-9c6d6e64924f","Type":"ContainerDied","Data":"0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883"}
Mar 13 12:01:29 crc kubenswrapper[4786]: I0313 12:01:29.805246 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7mcx" event={"ID":"219b765a-be88-4b82-92b3-9c6d6e64924f","Type":"ContainerStarted","Data":"15de6715b4e99d25a994f9282af0830e9208059a651d18e73abd1be494c9f01e"}
Mar 13 12:01:31 crc kubenswrapper[4786]: I0313 12:01:31.834998 4786 generic.go:334] "Generic (PLEG): container finished" podID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerID="4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec" exitCode=0
Mar 13 12:01:31 crc kubenswrapper[4786]: I0313 12:01:31.835455 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7mcx" event={"ID":"219b765a-be88-4b82-92b3-9c6d6e64924f","Type":"ContainerDied","Data":"4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec"}
Mar 13 12:01:32 crc kubenswrapper[4786]: I0313 12:01:32.846051 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7mcx" event={"ID":"219b765a-be88-4b82-92b3-9c6d6e64924f","Type":"ContainerStarted","Data":"d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000"}
Mar 13 12:01:32 crc kubenswrapper[4786]: I0313 12:01:32.876967 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m7mcx" podStartSLOduration=2.425641219 podStartE2EDuration="4.876939766s" podCreationTimestamp="2026-03-13 12:01:28 +0000 UTC" firstStartedPulling="2026-03-13 12:01:29.806141557 +0000 UTC m=+877.085795004" lastFinishedPulling="2026-03-13 12:01:32.257440064 +0000 UTC m=+879.537093551" observedRunningTime="2026-03-13 12:01:32.869543698 +0000 UTC m=+880.149197155" watchObservedRunningTime="2026-03-13 12:01:32.876939766 +0000 UTC m=+880.156593283"
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.871115 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4z4th"]
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.872135 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kube-rbac-proxy-node" containerID="cri-o://692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082" gracePeriod=30
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.872141 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d" gracePeriod=30
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.872159 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="nbdb" containerID="cri-o://89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337" gracePeriod=30
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.872430 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="sbdb" containerID="cri-o://907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489" gracePeriod=30
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.872490 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovn-acl-logging" containerID="cri-o://2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c" gracePeriod=30
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.872084 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovn-controller" containerID="cri-o://1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5" gracePeriod=30
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.872300 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="northd" containerID="cri-o://757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323" gracePeriod=30
Mar 13 12:01:35 crc kubenswrapper[4786]: I0313 12:01:35.929773 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller" containerID="cri-o://a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566" gracePeriod=30
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.310939 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/3.log"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.313701 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovn-acl-logging/0.log"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.314550 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovn-controller/0.log"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.315021 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.386074 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7jmj9"]
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.386523 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kube-rbac-proxy-node"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.386596 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kube-rbac-proxy-node"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.386653 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovn-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.386704 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovn-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.386756 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.386812 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.386869 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.386941 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.387011 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kubecfg-setup"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.387062 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kubecfg-setup"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.387144 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.387225 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.387301 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="sbdb"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.387368 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="sbdb"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.387434 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="northd"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.387492 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="northd"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.387554 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.387614 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.387675 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="nbdb"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.387737 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="nbdb"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.387798 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovn-acl-logging"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.387936 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovn-acl-logging"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388100 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="sbdb"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388189 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388263 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388333 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovn-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388397 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovn-acl-logging"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388460 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="northd"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388518 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388582 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388645 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="kube-rbac-proxy-node"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388711 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="nbdb"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.388775 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.388958 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.389035 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: E0313 12:01:36.389112 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.389189 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.389365 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerName="ovnkube-controller"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.391283 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435037 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-kubelet\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435395 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-var-lib-openvswitch\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435429 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-netns\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435448 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-openvswitch\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435469 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-ovn-kubernetes\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435489 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-ovn\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-etc-openvswitch\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435210 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435551 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435501 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435583 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435597 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435605 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435637 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435638 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-log-socket\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435731 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2cq4\" (UniqueName: \"kubernetes.io/projected/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-kube-api-access-g2cq4\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.435721 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-log-socket" (OuterVolumeSpecName: "log-socket") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436146 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-config\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436207 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-systemd\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436247 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-systemd-units\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436295 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovn-node-metrics-cert\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-node-log\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436366 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-slash\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436384 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436400 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-env-overrides\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436469 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-bin\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436502 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436485 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-slash" (OuterVolumeSpecName: "host-slash") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436489 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-node-log" (OuterVolumeSpecName: "node-log") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436559 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436536 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-script-lib\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436583 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436631 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-netd\") pod \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\" (UID: \"4fb3555e-af42-44e2-89e8-6f0a8d5d485c\") "
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436800 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436812 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436947 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.436953 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-ovnkube-script-lib\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437023 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-ovn\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437063 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437068 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4203122c-59fd-42f8-8f9d-0b330961e694-ovn-node-metrics-cert\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437146 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-env-overrides\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-systemd\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437244 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-etc-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-run-ovn-kubernetes\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437417 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437479 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-kubelet\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437520 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlfh\" (UniqueName: \"kubernetes.io/projected/4203122c-59fd-42f8-8f9d-0b330961e694-kube-api-access-wmlfh\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437561 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName:
\"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-cni-netd\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437602 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-cni-bin\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-node-log\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437702 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-var-lib-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437753 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-log-socket\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437817 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-ovnkube-config\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437868 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-run-netns\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437954 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-systemd-units\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.437984 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-slash\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438172 4786 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438196 4786 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 
12:01:36.438214 4786 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438233 4786 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438253 4786 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438270 4786 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438288 4786 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438305 4786 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-log-socket\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438322 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438338 4786 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438354 4786 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-node-log\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438370 4786 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-slash\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438386 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438404 4786 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438430 4786 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438458 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.438482 4786 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-host-cni-netd\") on node 
\"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.442127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-kube-api-access-g2cq4" (OuterVolumeSpecName: "kube-api-access-g2cq4") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "kube-api-access-g2cq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.443911 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.459802 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4fb3555e-af42-44e2-89e8-6f0a8d5d485c" (UID: "4fb3555e-af42-44e2-89e8-6f0a8d5d485c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540316 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-log-socket\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540395 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-ovnkube-config\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-run-netns\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540470 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-systemd-units\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540482 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-log-socket\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 
12:01:36.540497 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-slash\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540541 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-run-netns\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540560 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-ovnkube-script-lib\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540596 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-ovn\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540631 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4203122c-59fd-42f8-8f9d-0b330961e694-ovn-node-metrics-cert\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540663 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-systemd\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540692 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-env-overrides\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-etc-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540767 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-run-ovn-kubernetes\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540836 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540867 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-kubelet\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540924 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlfh\" (UniqueName: \"kubernetes.io/projected/4203122c-59fd-42f8-8f9d-0b330961e694-kube-api-access-wmlfh\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540955 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-cni-netd\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.540986 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-cni-bin\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541034 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-node-log\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541071 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-var-lib-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541144 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2cq4\" (UniqueName: \"kubernetes.io/projected/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-kube-api-access-g2cq4\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541165 4786 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541183 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb3555e-af42-44e2-89e8-6f0a8d5d485c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541183 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-etc-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541237 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-var-lib-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541261 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-systemd-units\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541290 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-openvswitch\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541309 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-slash\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541336 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-run-ovn-kubernetes\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541377 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-kubelet\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541555 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-ovnkube-config\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541847 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-cni-netd\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-host-cni-bin\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.541973 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-node-log\") pod \"ovnkube-node-7jmj9\" (UID: 
\"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.542029 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-systemd\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.542070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4203122c-59fd-42f8-8f9d-0b330961e694-run-ovn\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.542291 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-ovnkube-script-lib\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.542971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4203122c-59fd-42f8-8f9d-0b330961e694-env-overrides\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.547144 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4203122c-59fd-42f8-8f9d-0b330961e694-ovn-node-metrics-cert\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:36 crc 
kubenswrapper[4786]: I0313 12:01:36.562191 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlfh\" (UniqueName: \"kubernetes.io/projected/4203122c-59fd-42f8-8f9d-0b330961e694-kube-api-access-wmlfh\") pod \"ovnkube-node-7jmj9\" (UID: \"4203122c-59fd-42f8-8f9d-0b330961e694\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.716022 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:01:36 crc kubenswrapper[4786]: W0313 12:01:36.743049 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4203122c_59fd_42f8_8f9d_0b330961e694.slice/crio-c400483f5ca0431a16ee0433ba0e97f86f8b56687acbb43950164b6bdc954af2 WatchSource:0}: Error finding container c400483f5ca0431a16ee0433ba0e97f86f8b56687acbb43950164b6bdc954af2: Status 404 returned error can't find the container with id c400483f5ca0431a16ee0433ba0e97f86f8b56687acbb43950164b6bdc954af2
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.883504 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/2.log"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.884657 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/1.log"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.884718 4786 generic.go:334] "Generic (PLEG): container finished" podID="cd2e61d0-5deb-4005-85b4-c6f5ae70fe62" containerID="6827a12fecc9b0287ae0b64a23d85b0319b84398bbfce6c8aa49249074ac5ff4" exitCode=2
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.884784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b5xwr" event={"ID":"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62","Type":"ContainerDied","Data":"6827a12fecc9b0287ae0b64a23d85b0319b84398bbfce6c8aa49249074ac5ff4"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.884849 4786 scope.go:117] "RemoveContainer" containerID="e01c4d76d083eea20bcfbe41083fe7f664c0937caa3a838b7170306b063b1dec"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.885384 4786 scope.go:117] "RemoveContainer" containerID="6827a12fecc9b0287ae0b64a23d85b0319b84398bbfce6c8aa49249074ac5ff4"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.890388 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"c400483f5ca0431a16ee0433ba0e97f86f8b56687acbb43950164b6bdc954af2"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.898176 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovnkube-controller/3.log"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.901667 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovn-acl-logging/0.log"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.904918 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4z4th_4fb3555e-af42-44e2-89e8-6f0a8d5d485c/ovn-controller/0.log"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905366 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566" exitCode=0
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905415 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489" exitCode=0
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905429 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337" exitCode=0
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905441 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323" exitCode=0
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905451 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d" exitCode=0
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905460 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082" exitCode=0
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905469 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c" exitCode=143
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905479 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" containerID="1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5" exitCode=143
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905511 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905549 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905575 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905603 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905616 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905629 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905641 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905649 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905657 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905666 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905673 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905681 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905688 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905695 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905704 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905714 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905725 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905734 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905741 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905748 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905756 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905763 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905770 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905777 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905784 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905790 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905800 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905813 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905826 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905846 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905858 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905868 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905877 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905912 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905922 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905930 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905937 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905949 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th" event={"ID":"4fb3555e-af42-44e2-89e8-6f0a8d5d485c","Type":"ContainerDied","Data":"d863342be7ec5832f97184ccdf598c1c1277c5dccece1a4c79732c4f8bbee786"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905964 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905973 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905980 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905988 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.905995 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.906002 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.906009 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.906016 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.906023 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.906030 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"}
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.906127 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4z4th"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.920694 4786 scope.go:117] "RemoveContainer" containerID="a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.949562 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.966719 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4z4th"]
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.970904 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4z4th"]
Mar 13 12:01:36 crc kubenswrapper[4786]: I0313 12:01:36.974536 4786 scope.go:117] "RemoveContainer" containerID="907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.069915 4786 scope.go:117] "RemoveContainer" containerID="89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.096136 4786 scope.go:117] "RemoveContainer" containerID="757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.112740 4786 scope.go:117] "RemoveContainer" containerID="0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.124252 4786 scope.go:117] "RemoveContainer" containerID="692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.150989 4786 scope.go:117] "RemoveContainer" containerID="2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.164332 4786 scope.go:117] "RemoveContainer" containerID="1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.183850 4786 scope.go:117] "RemoveContainer" containerID="c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.205380 4786 scope.go:117] "RemoveContainer" containerID="a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.205822 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": container with ID starting with a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566 not found: ID does not exist" containerID="a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.205908 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"} err="failed to get container status \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": rpc error: code = NotFound desc = could not find container \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": container with ID starting with a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.205948 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.206566 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": container with ID starting with 6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272 not found: ID does not exist" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.206610 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"} err="failed to get container status \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": rpc error: code = NotFound desc = could not find container \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": container with ID starting with 6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.206656 4786 scope.go:117] "RemoveContainer" containerID="907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.207075 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": container with ID starting with 907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489 not found: ID does not exist" containerID="907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.207106 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"} err="failed to get container status \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": rpc error: code = NotFound desc = could not find container \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": container with ID starting with 907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.207129 4786 scope.go:117] "RemoveContainer" containerID="89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.207346 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": container with ID starting with 89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337 not found: ID does not exist" containerID="89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.207367 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"} err="failed to get container status \"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": rpc error: code = NotFound desc = could not find container \"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": container with ID starting with 89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.207385 4786 scope.go:117] "RemoveContainer" containerID="757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.207582 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": container with ID starting with 757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323 not found: ID does not exist" containerID="757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.207602 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"} err="failed to get container status \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": rpc error: code = NotFound desc = could not find container \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": container with ID starting with 757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.207615 4786 scope.go:117] "RemoveContainer" containerID="0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.207844 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": container with ID starting with 0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d not found: ID does not exist" containerID="0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.207893 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"} err="failed to get container status \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": rpc error: code = NotFound desc = could not find container \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": container with ID starting with 0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.207923 4786 scope.go:117] "RemoveContainer" containerID="692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.208129 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": container with ID starting with 692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082 not found: ID does not exist" containerID="692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.208155 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"} err="failed to get container status \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": rpc error: code = NotFound desc = could not find container \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": container with ID starting with 692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.208170 4786 scope.go:117] "RemoveContainer" containerID="2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.208396 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": container with ID starting with 2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c not found: ID does not exist" containerID="2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.208411 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"} err="failed to get container status \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": rpc error: code = NotFound desc = could not find container \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": container with ID starting with 2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.208425 4786 scope.go:117] "RemoveContainer" containerID="1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.208630 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": container with ID starting with 1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5 not found: ID does not exist" containerID="1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.208665 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"} err="failed to get container status \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": rpc error: code = NotFound desc = could not find container \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": container with ID starting with 1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.208690 4786 scope.go:117] "RemoveContainer" containerID="c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"
Mar 13 12:01:37 crc kubenswrapper[4786]: E0313 12:01:37.209164 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": container with ID starting with c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f not found: ID does not exist" containerID="c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.209186 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"} err="failed to get container status \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": rpc error: code = NotFound desc = could not find container \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": container with ID starting with c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.209200 4786 scope.go:117] "RemoveContainer" containerID="a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.209729 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"} err="failed to get container status \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": rpc error: code = NotFound desc = could not find container \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": container with ID starting with a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.209757 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.210110 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"} err="failed to get container status \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": rpc error: code = NotFound desc = could not find container \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": container with ID starting with 6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.210149 4786 scope.go:117] "RemoveContainer" containerID="907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.210412 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"} err="failed to get container status \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": rpc error: code = NotFound desc = could not find container \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": container with ID starting with 907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.210459 4786 scope.go:117] "RemoveContainer" containerID="89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.210701 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"} err="failed to get container status \"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": rpc error: code = NotFound desc = could not find container \"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": container with ID starting with 89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.210725 4786 scope.go:117] "RemoveContainer" containerID="757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.211176 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"} err="failed to get container status \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": rpc error: code = NotFound desc = could not find container \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": container with ID starting with 757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.211213 4786 scope.go:117] "RemoveContainer" containerID="0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.211482 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"} err="failed to get container status \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": rpc error: code = NotFound desc = could not find container \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": container with ID starting with 0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.211506 4786 scope.go:117] "RemoveContainer" containerID="692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.211789 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"} err="failed to get container status \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": rpc error: code = NotFound desc = could not find container \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": container with ID starting with 692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.211820 4786 scope.go:117] "RemoveContainer" containerID="2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.212093 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"} err="failed to get container status \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": rpc error: code = NotFound desc = could not find container \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": container with ID starting with 2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.212118 4786 scope.go:117] "RemoveContainer" containerID="1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.212377 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"} err="failed to get container status \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": rpc error: code = NotFound desc = could not find container \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": container with ID starting with 1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5 not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.212412 4786 scope.go:117] "RemoveContainer" containerID="c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.212650 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"} err="failed to get container status \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": rpc error: code = NotFound desc = could not find container \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": container with ID starting with c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f not found: ID does not exist"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.212675 4786 scope.go:117] "RemoveContainer" containerID="a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"
Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.212948 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"} err="failed to get container status \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": rpc error: code = NotFound desc = could not find container \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": container with ID starting with a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566 not found: ID does not
exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.212986 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.213304 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"} err="failed to get container status \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": rpc error: code = NotFound desc = could not find container \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": container with ID starting with 6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.213359 4786 scope.go:117] "RemoveContainer" containerID="907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.213663 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"} err="failed to get container status \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": rpc error: code = NotFound desc = could not find container \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": container with ID starting with 907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.213695 4786 scope.go:117] "RemoveContainer" containerID="89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.214279 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"} err="failed to get container status 
\"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": rpc error: code = NotFound desc = could not find container \"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": container with ID starting with 89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.214304 4786 scope.go:117] "RemoveContainer" containerID="757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.214633 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"} err="failed to get container status \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": rpc error: code = NotFound desc = could not find container \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": container with ID starting with 757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.214666 4786 scope.go:117] "RemoveContainer" containerID="0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.214930 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"} err="failed to get container status \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": rpc error: code = NotFound desc = could not find container \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": container with ID starting with 0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.214961 4786 scope.go:117] "RemoveContainer" 
containerID="692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.215225 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"} err="failed to get container status \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": rpc error: code = NotFound desc = could not find container \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": container with ID starting with 692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.215295 4786 scope.go:117] "RemoveContainer" containerID="2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.215515 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"} err="failed to get container status \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": rpc error: code = NotFound desc = could not find container \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": container with ID starting with 2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.215547 4786 scope.go:117] "RemoveContainer" containerID="1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.215775 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"} err="failed to get container status \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": rpc error: code = NotFound desc = could 
not find container \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": container with ID starting with 1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.215803 4786 scope.go:117] "RemoveContainer" containerID="c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.216022 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"} err="failed to get container status \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": rpc error: code = NotFound desc = could not find container \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": container with ID starting with c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.216046 4786 scope.go:117] "RemoveContainer" containerID="a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.216352 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566"} err="failed to get container status \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": rpc error: code = NotFound desc = could not find container \"a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566\": container with ID starting with a24b4faceaa4c0faddffb69edd4e1100379d270c441e3da5b05a2bb726948566 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.216396 4786 scope.go:117] "RemoveContainer" containerID="6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 
12:01:37.216796 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272"} err="failed to get container status \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": rpc error: code = NotFound desc = could not find container \"6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272\": container with ID starting with 6c1c8717e1df5f82b5f3d04b0ce4b8f17b99f8bf9c290634d67813352eae7272 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.216829 4786 scope.go:117] "RemoveContainer" containerID="907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.217226 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489"} err="failed to get container status \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": rpc error: code = NotFound desc = could not find container \"907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489\": container with ID starting with 907724965c365102591aa166a95848bc25d952dd336dcb1b8fcb0063bde24489 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.217255 4786 scope.go:117] "RemoveContainer" containerID="89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.221395 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337"} err="failed to get container status \"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": rpc error: code = NotFound desc = could not find container \"89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337\": container with ID starting with 
89289d604e6ded772f96cb47a15f017613422574238fe443d8cb596702710337 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.221446 4786 scope.go:117] "RemoveContainer" containerID="757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.224530 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323"} err="failed to get container status \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": rpc error: code = NotFound desc = could not find container \"757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323\": container with ID starting with 757af434ce357b8278a8a47cf4c50495c6092f2e1b8ccd019c28cc159c379323 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.224577 4786 scope.go:117] "RemoveContainer" containerID="0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.225379 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d"} err="failed to get container status \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": rpc error: code = NotFound desc = could not find container \"0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d\": container with ID starting with 0abf4f678509017e3d593d7e9695ad6ac84b314f48f49abf96d9c5d3574a5c9d not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.225415 4786 scope.go:117] "RemoveContainer" containerID="692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.225934 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082"} err="failed to get container status \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": rpc error: code = NotFound desc = could not find container \"692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082\": container with ID starting with 692970b1a593f57eb13b4d00471f6e50093e4e1e6c17922530d39e04df509082 not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.225977 4786 scope.go:117] "RemoveContainer" containerID="2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.226740 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c"} err="failed to get container status \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": rpc error: code = NotFound desc = could not find container \"2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c\": container with ID starting with 2c2d848ad70f15da70579ebdfe6840ef11a5d5139abe65cc87bb7523f241ae5c not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.226788 4786 scope.go:117] "RemoveContainer" containerID="1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.227078 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5"} err="failed to get container status \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": rpc error: code = NotFound desc = could not find container \"1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5\": container with ID starting with 1e588541332422cc8142082bf49910af6db7cb5555d6f3a806b0a1381c57aad5 not found: ID does not 
exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.227099 4786 scope.go:117] "RemoveContainer" containerID="c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.227615 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f"} err="failed to get container status \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": rpc error: code = NotFound desc = could not find container \"c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f\": container with ID starting with c9906eb94a7bbe00bb1194694b8d626a00a92885428a1ce7bf67ae0014c6dc1f not found: ID does not exist" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.451047 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb3555e-af42-44e2-89e8-6f0a8d5d485c" path="/var/lib/kubelet/pods/4fb3555e-af42-44e2-89e8-6f0a8d5d485c/volumes" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.913966 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b5xwr_cd2e61d0-5deb-4005-85b4-c6f5ae70fe62/kube-multus/2.log" Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.914099 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b5xwr" event={"ID":"cd2e61d0-5deb-4005-85b4-c6f5ae70fe62","Type":"ContainerStarted","Data":"a5ccce026395d1ef05bd7d3c868d55daab11563b2f50c30269320c2b2f6c3831"} Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.915713 4786 generic.go:334] "Generic (PLEG): container finished" podID="4203122c-59fd-42f8-8f9d-0b330961e694" containerID="d16f58e10046b4d8de7495d93b4ded5d026f78aff925d6ace2c1e2e8f16b8173" exitCode=0 Mar 13 12:01:37 crc kubenswrapper[4786]: I0313 12:01:37.915824 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" 
event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerDied","Data":"d16f58e10046b4d8de7495d93b4ded5d026f78aff925d6ace2c1e2e8f16b8173"} Mar 13 12:01:38 crc kubenswrapper[4786]: I0313 12:01:38.169530 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:01:38 crc kubenswrapper[4786]: I0313 12:01:38.169636 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:01:38 crc kubenswrapper[4786]: I0313 12:01:38.169713 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 12:01:38 crc kubenswrapper[4786]: I0313 12:01:38.170635 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53f9a9165f399ca75a5c5e665434b0714c4c497324b97b5da97227bbf25aa5b5"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:01:38 crc kubenswrapper[4786]: I0313 12:01:38.170766 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://53f9a9165f399ca75a5c5e665434b0714c4c497324b97b5da97227bbf25aa5b5" gracePeriod=600 Mar 13 12:01:38 crc kubenswrapper[4786]: I0313 12:01:38.927044 4786 
generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="53f9a9165f399ca75a5c5e665434b0714c4c497324b97b5da97227bbf25aa5b5" exitCode=0 Mar 13 12:01:38 crc kubenswrapper[4786]: I0313 12:01:38.927108 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"53f9a9165f399ca75a5c5e665434b0714c4c497324b97b5da97227bbf25aa5b5"} Mar 13 12:01:38 crc kubenswrapper[4786]: I0313 12:01:38.927577 4786 scope.go:117] "RemoveContainer" containerID="547b00ed88c3949b89b12da8316f4066352f35b5af45e0f69411a7f36910a357" Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.227747 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m7mcx" Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.227850 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m7mcx" Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.933777 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"cee9aff52905686331ac0d49b868be713596890b00b0633ce66e8cdee6b5f0de"} Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.938541 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"fc7c020caa34e2de786651916965c47a3bed0a8c6eb8e3865e9c72185b6bb3be"} Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.938644 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" 
event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"adcd0d5584c7dc032d2019056932c2379def38ec965c60bbb1e399fbdf41bcec"} Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.938702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"60d31ce505158d6d8967fc696c219f40ae3d1ad9ea633a7bc4ac25d61b21a600"} Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.938772 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"4a173269f5aba54b5b01794006b6f7b64401775598adc01514528dd8138eb1c8"} Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.938849 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"eb766310db7388906ba2ac869586f5d5cb5802bf3360c398386a4617ec16ae0d"} Mar 13 12:01:39 crc kubenswrapper[4786]: I0313 12:01:39.938944 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"17004991e9ca38f67e1bf8e9dae477b641a695e522c21681b36472ed54df4f83"} Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.296200 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m7mcx" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="registry-server" probeResult="failure" output=< Mar 13 12:01:40 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 12:01:40 crc kubenswrapper[4786]: > Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.425632 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-hbx9j"] Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.427555 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.493237 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-catalog-content\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.493405 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkppz\" (UniqueName: \"kubernetes.io/projected/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-kube-api-access-qkppz\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.493668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-utilities\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.594795 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-utilities\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.594903 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-catalog-content\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.594968 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkppz\" (UniqueName: \"kubernetes.io/projected/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-kube-api-access-qkppz\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.595691 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-utilities\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.595702 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-catalog-content\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.615302 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkppz\" (UniqueName: \"kubernetes.io/projected/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-kube-api-access-qkppz\") pod \"certified-operators-hbx9j\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") " pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: I0313 12:01:40.744592 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: E0313 12:01:40.785117 4786 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-hbx9j_openshift-marketplace_4137de19-8f78-4d95-bfa9-eaad93f1cbfb_0(e986bb53ab39e3e4e45dc30d875feac25c3762862a3fd08340670a3a520cd0c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:01:40 crc kubenswrapper[4786]: E0313 12:01:40.785214 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-hbx9j_openshift-marketplace_4137de19-8f78-4d95-bfa9-eaad93f1cbfb_0(e986bb53ab39e3e4e45dc30d875feac25c3762862a3fd08340670a3a520cd0c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: E0313 12:01:40.785247 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-hbx9j_openshift-marketplace_4137de19-8f78-4d95-bfa9-eaad93f1cbfb_0(e986bb53ab39e3e4e45dc30d875feac25c3762862a3fd08340670a3a520cd0c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:40 crc kubenswrapper[4786]: E0313 12:01:40.785306 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-hbx9j_openshift-marketplace(4137de19-8f78-4d95-bfa9-eaad93f1cbfb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-hbx9j_openshift-marketplace(4137de19-8f78-4d95-bfa9-eaad93f1cbfb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-hbx9j_openshift-marketplace_4137de19-8f78-4d95-bfa9-eaad93f1cbfb_0(e986bb53ab39e3e4e45dc30d875feac25c3762862a3fd08340670a3a520cd0c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/certified-operators-hbx9j" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.306380 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bx2f9"] Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.307290 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.309442 4786 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-86xh9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.309734 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.310116 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.310479 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.403820 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-node-mnt\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.403871 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pft7c\" (UniqueName: \"kubernetes.io/projected/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-kube-api-access-pft7c\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.403938 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-crc-storage\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.505064 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pft7c\" (UniqueName: \"kubernetes.io/projected/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-kube-api-access-pft7c\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.505162 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-crc-storage\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.505206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-node-mnt\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.505448 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-node-mnt\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.506471 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-crc-storage\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.525211 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pft7c\" (UniqueName: 
\"kubernetes.io/projected/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-kube-api-access-pft7c\") pod \"crc-storage-crc-bx2f9\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") " pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: I0313 12:01:41.627135 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: E0313 12:01:41.654699 4786 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bx2f9_crc-storage_a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae_0(5545aa595957a1d9d6560f09eabfe3add0615a3dbc2ceee36873a82debe66926): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:01:41 crc kubenswrapper[4786]: E0313 12:01:41.654864 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bx2f9_crc-storage_a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae_0(5545aa595957a1d9d6560f09eabfe3add0615a3dbc2ceee36873a82debe66926): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: E0313 12:01:41.655019 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bx2f9_crc-storage_a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae_0(5545aa595957a1d9d6560f09eabfe3add0615a3dbc2ceee36873a82debe66926): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:41 crc kubenswrapper[4786]: E0313 12:01:41.655166 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bx2f9_crc-storage(a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bx2f9_crc-storage(a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bx2f9_crc-storage_a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae_0(5545aa595957a1d9d6560f09eabfe3add0615a3dbc2ceee36873a82debe66926): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bx2f9" podUID="a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" Mar 13 12:01:42 crc kubenswrapper[4786]: I0313 12:01:42.965605 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"b937ef8452044620ee73e50d4c238f237170bc5f29726a398d542a7098a3ba98"} Mar 13 12:01:44 crc kubenswrapper[4786]: I0313 12:01:44.981441 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" event={"ID":"4203122c-59fd-42f8-8f9d-0b330961e694","Type":"ContainerStarted","Data":"2a947a7a2f534b35367fa7da1bddd2b0f7229a6204bc30b262403cb3a8b6889c"} Mar 13 12:01:44 crc kubenswrapper[4786]: I0313 12:01:44.981999 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:44 crc kubenswrapper[4786]: I0313 12:01:44.982016 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:44 crc kubenswrapper[4786]: I0313 12:01:44.982027 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" 
Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.017770 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.022209 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.031240 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9" podStartSLOduration=9.03121175 podStartE2EDuration="9.03121175s" podCreationTimestamp="2026-03-13 12:01:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:01:45.027397582 +0000 UTC m=+892.307051099" watchObservedRunningTime="2026-03-13 12:01:45.03121175 +0000 UTC m=+892.310865237" Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.323943 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hbx9j"] Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.324171 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.324853 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.335901 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bx2f9"] Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.336042 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:45 crc kubenswrapper[4786]: I0313 12:01:45.336528 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:45 crc kubenswrapper[4786]: E0313 12:01:45.375152 4786 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-hbx9j_openshift-marketplace_4137de19-8f78-4d95-bfa9-eaad93f1cbfb_0(eade1e4a9eb35e62bdaa945cc815a44043435f5f998344002f6b1644c8ad7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:01:45 crc kubenswrapper[4786]: E0313 12:01:45.375453 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-hbx9j_openshift-marketplace_4137de19-8f78-4d95-bfa9-eaad93f1cbfb_0(eade1e4a9eb35e62bdaa945cc815a44043435f5f998344002f6b1644c8ad7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:45 crc kubenswrapper[4786]: E0313 12:01:45.375478 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-hbx9j_openshift-marketplace_4137de19-8f78-4d95-bfa9-eaad93f1cbfb_0(eade1e4a9eb35e62bdaa945cc815a44043435f5f998344002f6b1644c8ad7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:45 crc kubenswrapper[4786]: E0313 12:01:45.375521 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-hbx9j_openshift-marketplace(4137de19-8f78-4d95-bfa9-eaad93f1cbfb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-hbx9j_openshift-marketplace(4137de19-8f78-4d95-bfa9-eaad93f1cbfb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-hbx9j_openshift-marketplace_4137de19-8f78-4d95-bfa9-eaad93f1cbfb_0(eade1e4a9eb35e62bdaa945cc815a44043435f5f998344002f6b1644c8ad7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/certified-operators-hbx9j" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" Mar 13 12:01:45 crc kubenswrapper[4786]: E0313 12:01:45.391698 4786 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bx2f9_crc-storage_a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae_0(9ce024114fbd5e573d2cd333646dec1d779f3fb60e8f417764c76d88c81aaa58): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:01:45 crc kubenswrapper[4786]: E0313 12:01:45.391766 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bx2f9_crc-storage_a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae_0(9ce024114fbd5e573d2cd333646dec1d779f3fb60e8f417764c76d88c81aaa58): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:45 crc kubenswrapper[4786]: E0313 12:01:45.391794 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bx2f9_crc-storage_a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae_0(9ce024114fbd5e573d2cd333646dec1d779f3fb60e8f417764c76d88c81aaa58): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:45 crc kubenswrapper[4786]: E0313 12:01:45.391839 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bx2f9_crc-storage(a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bx2f9_crc-storage(a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bx2f9_crc-storage_a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae_0(9ce024114fbd5e573d2cd333646dec1d779f3fb60e8f417764c76d88c81aaa58): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bx2f9" podUID="a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" Mar 13 12:01:49 crc kubenswrapper[4786]: I0313 12:01:49.286546 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m7mcx" Mar 13 12:01:49 crc kubenswrapper[4786]: I0313 12:01:49.350821 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m7mcx" Mar 13 12:01:49 crc kubenswrapper[4786]: I0313 12:01:49.535875 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m7mcx"] Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.036469 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m7mcx" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="registry-server" containerID="cri-o://d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000" gracePeriod=2 Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.502685 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m7mcx" Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.553295 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-catalog-content\") pod \"219b765a-be88-4b82-92b3-9c6d6e64924f\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.553466 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-utilities\") pod \"219b765a-be88-4b82-92b3-9c6d6e64924f\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.553549 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98dvf\" (UniqueName: \"kubernetes.io/projected/219b765a-be88-4b82-92b3-9c6d6e64924f-kube-api-access-98dvf\") pod \"219b765a-be88-4b82-92b3-9c6d6e64924f\" (UID: \"219b765a-be88-4b82-92b3-9c6d6e64924f\") " Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.555201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-utilities" (OuterVolumeSpecName: "utilities") pod "219b765a-be88-4b82-92b3-9c6d6e64924f" (UID: "219b765a-be88-4b82-92b3-9c6d6e64924f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.582628 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219b765a-be88-4b82-92b3-9c6d6e64924f-kube-api-access-98dvf" (OuterVolumeSpecName: "kube-api-access-98dvf") pod "219b765a-be88-4b82-92b3-9c6d6e64924f" (UID: "219b765a-be88-4b82-92b3-9c6d6e64924f"). InnerVolumeSpecName "kube-api-access-98dvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.655433 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.655487 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98dvf\" (UniqueName: \"kubernetes.io/projected/219b765a-be88-4b82-92b3-9c6d6e64924f-kube-api-access-98dvf\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.714157 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "219b765a-be88-4b82-92b3-9c6d6e64924f" (UID: "219b765a-be88-4b82-92b3-9c6d6e64924f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:01:51 crc kubenswrapper[4786]: I0313 12:01:51.757082 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219b765a-be88-4b82-92b3-9c6d6e64924f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.063725 4786 generic.go:334] "Generic (PLEG): container finished" podID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerID="d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000" exitCode=0 Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.063798 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7mcx" event={"ID":"219b765a-be88-4b82-92b3-9c6d6e64924f","Type":"ContainerDied","Data":"d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000"} Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.063837 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-m7mcx" event={"ID":"219b765a-be88-4b82-92b3-9c6d6e64924f","Type":"ContainerDied","Data":"15de6715b4e99d25a994f9282af0830e9208059a651d18e73abd1be494c9f01e"} Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.063865 4786 scope.go:117] "RemoveContainer" containerID="d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.064422 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7mcx" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.089494 4786 scope.go:117] "RemoveContainer" containerID="4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.123632 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m7mcx"] Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.123689 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m7mcx"] Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.131994 4786 scope.go:117] "RemoveContainer" containerID="0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.153448 4786 scope.go:117] "RemoveContainer" containerID="d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000" Mar 13 12:01:52 crc kubenswrapper[4786]: E0313 12:01:52.154112 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000\": container with ID starting with d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000 not found: ID does not exist" containerID="d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.154147 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000"} err="failed to get container status \"d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000\": rpc error: code = NotFound desc = could not find container \"d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000\": container with ID starting with d262f8976d200c0ba51cc5399d865185c23aae00dd03068e745f51b87d11b000 not found: ID does not exist" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.154169 4786 scope.go:117] "RemoveContainer" containerID="4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec" Mar 13 12:01:52 crc kubenswrapper[4786]: E0313 12:01:52.155125 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec\": container with ID starting with 4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec not found: ID does not exist" containerID="4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.155156 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec"} err="failed to get container status \"4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec\": rpc error: code = NotFound desc = could not find container \"4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec\": container with ID starting with 4c566eaa81b653ef86cf21f551166750ecf1fc72526d7c180a6a96c88fe130ec not found: ID does not exist" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.155173 4786 scope.go:117] "RemoveContainer" containerID="0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883" Mar 13 12:01:52 crc kubenswrapper[4786]: E0313 
12:01:52.155506 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883\": container with ID starting with 0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883 not found: ID does not exist" containerID="0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883" Mar 13 12:01:52 crc kubenswrapper[4786]: I0313 12:01:52.155533 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883"} err="failed to get container status \"0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883\": rpc error: code = NotFound desc = could not find container \"0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883\": container with ID starting with 0dfc0a2a61c028812565e50f3815c98b6bf6304525bbd8f1f828e3e0aab2f883 not found: ID does not exist" Mar 13 12:01:53 crc kubenswrapper[4786]: I0313 12:01:53.449066 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" path="/var/lib/kubelet/pods/219b765a-be88-4b82-92b3-9c6d6e64924f/volumes" Mar 13 12:01:55 crc kubenswrapper[4786]: I0313 12:01:55.440124 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:55 crc kubenswrapper[4786]: I0313 12:01:55.440655 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hbx9j" Mar 13 12:01:55 crc kubenswrapper[4786]: I0313 12:01:55.915964 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hbx9j"] Mar 13 12:01:56 crc kubenswrapper[4786]: I0313 12:01:56.097149 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbx9j" event={"ID":"4137de19-8f78-4d95-bfa9-eaad93f1cbfb","Type":"ContainerStarted","Data":"0c409fd66bbbc22e62c7cbabfbeaf11b479f564976ec0d70685d28a1865ed700"} Mar 13 12:01:57 crc kubenswrapper[4786]: I0313 12:01:57.112959 4786 generic.go:334] "Generic (PLEG): container finished" podID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerID="4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572" exitCode=0 Mar 13 12:01:57 crc kubenswrapper[4786]: I0313 12:01:57.113190 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbx9j" event={"ID":"4137de19-8f78-4d95-bfa9-eaad93f1cbfb","Type":"ContainerDied","Data":"4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572"} Mar 13 12:01:58 crc kubenswrapper[4786]: I0313 12:01:58.125803 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbx9j" event={"ID":"4137de19-8f78-4d95-bfa9-eaad93f1cbfb","Type":"ContainerStarted","Data":"7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105"} Mar 13 12:01:59 crc kubenswrapper[4786]: I0313 12:01:59.137525 4786 generic.go:334] "Generic (PLEG): container finished" podID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerID="7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105" exitCode=0 Mar 13 12:01:59 crc kubenswrapper[4786]: I0313 12:01:59.137587 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbx9j" 
event={"ID":"4137de19-8f78-4d95-bfa9-eaad93f1cbfb","Type":"ContainerDied","Data":"7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105"} Mar 13 12:01:59 crc kubenswrapper[4786]: I0313 12:01:59.439947 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:59 crc kubenswrapper[4786]: I0313 12:01:59.440690 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bx2f9" Mar 13 12:01:59 crc kubenswrapper[4786]: I0313 12:01:59.909592 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bx2f9"] Mar 13 12:01:59 crc kubenswrapper[4786]: W0313 12:01:59.919109 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda43fae39_7b51_4cc4_bf5e_68fb8cebd4ae.slice/crio-ba20105cea6b2c43d1670501227f8bc99a25c130461d9e2eb6e25380b01c47ae WatchSource:0}: Error finding container ba20105cea6b2c43d1670501227f8bc99a25c130461d9e2eb6e25380b01c47ae: Status 404 returned error can't find the container with id ba20105cea6b2c43d1670501227f8bc99a25c130461d9e2eb6e25380b01c47ae Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.140967 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556722-4dhwg"] Mar 13 12:02:00 crc kubenswrapper[4786]: E0313 12:02:00.142207 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="registry-server" Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.142248 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="registry-server" Mar 13 12:02:00 crc kubenswrapper[4786]: E0313 12:02:00.142487 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="extract-utilities" Mar 13 12:02:00 
crc kubenswrapper[4786]: I0313 12:02:00.142516 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="extract-utilities"
Mar 13 12:02:00 crc kubenswrapper[4786]: E0313 12:02:00.142610 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="extract-content"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.142627 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="extract-content"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.142814 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="219b765a-be88-4b82-92b3-9c6d6e64924f" containerName="registry-server"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.143488 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-4dhwg"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.146049 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.150283 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.152656 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.156638 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbx9j" event={"ID":"4137de19-8f78-4d95-bfa9-eaad93f1cbfb","Type":"ContainerStarted","Data":"465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d"}
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.158477 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-4dhwg"]
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.161071 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bx2f9" event={"ID":"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae","Type":"ContainerStarted","Data":"ba20105cea6b2c43d1670501227f8bc99a25c130461d9e2eb6e25380b01c47ae"}
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.174541 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xgfs\" (UniqueName: \"kubernetes.io/projected/af93e5fc-a1db-44c4-aecb-db9648c603ab-kube-api-access-9xgfs\") pod \"auto-csr-approver-29556722-4dhwg\" (UID: \"af93e5fc-a1db-44c4-aecb-db9648c603ab\") " pod="openshift-infra/auto-csr-approver-29556722-4dhwg"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.197509 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hbx9j" podStartSLOduration=17.720805655 podStartE2EDuration="20.197483021s" podCreationTimestamp="2026-03-13 12:01:40 +0000 UTC" firstStartedPulling="2026-03-13 12:01:57.116253014 +0000 UTC m=+904.395906491" lastFinishedPulling="2026-03-13 12:01:59.59293038 +0000 UTC m=+906.872583857" observedRunningTime="2026-03-13 12:02:00.19315149 +0000 UTC m=+907.472805017" watchObservedRunningTime="2026-03-13 12:02:00.197483021 +0000 UTC m=+907.477136508"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.276911 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xgfs\" (UniqueName: \"kubernetes.io/projected/af93e5fc-a1db-44c4-aecb-db9648c603ab-kube-api-access-9xgfs\") pod \"auto-csr-approver-29556722-4dhwg\" (UID: \"af93e5fc-a1db-44c4-aecb-db9648c603ab\") " pod="openshift-infra/auto-csr-approver-29556722-4dhwg"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.311291 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xgfs\" (UniqueName: \"kubernetes.io/projected/af93e5fc-a1db-44c4-aecb-db9648c603ab-kube-api-access-9xgfs\") pod \"auto-csr-approver-29556722-4dhwg\" (UID: \"af93e5fc-a1db-44c4-aecb-db9648c603ab\") " pod="openshift-infra/auto-csr-approver-29556722-4dhwg"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.488951 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-4dhwg"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.745854 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hbx9j"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.745956 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hbx9j"
Mar 13 12:02:00 crc kubenswrapper[4786]: I0313 12:02:00.963452 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-4dhwg"]
Mar 13 12:02:01 crc kubenswrapper[4786]: I0313 12:02:01.797083 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hbx9j" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="registry-server" probeResult="failure" output=<
Mar 13 12:02:01 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s
Mar 13 12:02:01 crc kubenswrapper[4786]: >
Mar 13 12:02:02 crc kubenswrapper[4786]: I0313 12:02:02.178962 4786 generic.go:334] "Generic (PLEG): container finished" podID="a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" containerID="2075c6f63dbcf2ff2c24aa36b4b5cdccc48c5f3b467478f578246b0875d59573" exitCode=0
Mar 13 12:02:02 crc kubenswrapper[4786]: I0313 12:02:02.179137 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bx2f9" event={"ID":"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae","Type":"ContainerDied","Data":"2075c6f63dbcf2ff2c24aa36b4b5cdccc48c5f3b467478f578246b0875d59573"}
Mar 13 12:02:02 crc kubenswrapper[4786]: I0313 12:02:02.181990 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-4dhwg" event={"ID":"af93e5fc-a1db-44c4-aecb-db9648c603ab","Type":"ContainerStarted","Data":"2cdaf89ed325c337f86acb978ae192a394575b3f35aaa80cfaaa2282d88c53a3"}
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.195126 4786 generic.go:334] "Generic (PLEG): container finished" podID="af93e5fc-a1db-44c4-aecb-db9648c603ab" containerID="3412e6946ef7f2fe51beeb053507012c5963bddc334090e08cf9dbf529ca8bb5" exitCode=0
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.195215 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-4dhwg" event={"ID":"af93e5fc-a1db-44c4-aecb-db9648c603ab","Type":"ContainerDied","Data":"3412e6946ef7f2fe51beeb053507012c5963bddc334090e08cf9dbf529ca8bb5"}
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.471472 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bx2f9"
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.520873 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pft7c\" (UniqueName: \"kubernetes.io/projected/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-kube-api-access-pft7c\") pod \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") "
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.521072 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-crc-storage\") pod \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") "
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.521141 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-node-mnt\") pod \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\" (UID: \"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae\") "
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.521294 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" (UID: "a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.521806 4786 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.532406 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-kube-api-access-pft7c" (OuterVolumeSpecName: "kube-api-access-pft7c") pod "a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" (UID: "a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae"). InnerVolumeSpecName "kube-api-access-pft7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.546976 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" (UID: "a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.622848 4786 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:03 crc kubenswrapper[4786]: I0313 12:02:03.622969 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pft7c\" (UniqueName: \"kubernetes.io/projected/a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae-kube-api-access-pft7c\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:04 crc kubenswrapper[4786]: I0313 12:02:04.205684 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bx2f9" event={"ID":"a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae","Type":"ContainerDied","Data":"ba20105cea6b2c43d1670501227f8bc99a25c130461d9e2eb6e25380b01c47ae"}
Mar 13 12:02:04 crc kubenswrapper[4786]: I0313 12:02:04.206193 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba20105cea6b2c43d1670501227f8bc99a25c130461d9e2eb6e25380b01c47ae"
Mar 13 12:02:04 crc kubenswrapper[4786]: I0313 12:02:04.205729 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bx2f9"
Mar 13 12:02:04 crc kubenswrapper[4786]: I0313 12:02:04.554651 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-4dhwg"
Mar 13 12:02:04 crc kubenswrapper[4786]: I0313 12:02:04.636651 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xgfs\" (UniqueName: \"kubernetes.io/projected/af93e5fc-a1db-44c4-aecb-db9648c603ab-kube-api-access-9xgfs\") pod \"af93e5fc-a1db-44c4-aecb-db9648c603ab\" (UID: \"af93e5fc-a1db-44c4-aecb-db9648c603ab\") "
Mar 13 12:02:04 crc kubenswrapper[4786]: I0313 12:02:04.642685 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af93e5fc-a1db-44c4-aecb-db9648c603ab-kube-api-access-9xgfs" (OuterVolumeSpecName: "kube-api-access-9xgfs") pod "af93e5fc-a1db-44c4-aecb-db9648c603ab" (UID: "af93e5fc-a1db-44c4-aecb-db9648c603ab"). InnerVolumeSpecName "kube-api-access-9xgfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:02:04 crc kubenswrapper[4786]: I0313 12:02:04.738486 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xgfs\" (UniqueName: \"kubernetes.io/projected/af93e5fc-a1db-44c4-aecb-db9648c603ab-kube-api-access-9xgfs\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:05 crc kubenswrapper[4786]: I0313 12:02:05.217406 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-4dhwg" event={"ID":"af93e5fc-a1db-44c4-aecb-db9648c603ab","Type":"ContainerDied","Data":"2cdaf89ed325c337f86acb978ae192a394575b3f35aaa80cfaaa2282d88c53a3"}
Mar 13 12:02:05 crc kubenswrapper[4786]: I0313 12:02:05.217462 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cdaf89ed325c337f86acb978ae192a394575b3f35aaa80cfaaa2282d88c53a3"
Mar 13 12:02:05 crc kubenswrapper[4786]: I0313 12:02:05.217475 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-4dhwg"
Mar 13 12:02:05 crc kubenswrapper[4786]: I0313 12:02:05.632915 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-qc8sr"]
Mar 13 12:02:05 crc kubenswrapper[4786]: I0313 12:02:05.638166 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-qc8sr"]
Mar 13 12:02:06 crc kubenswrapper[4786]: I0313 12:02:06.757204 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7jmj9"
Mar 13 12:02:07 crc kubenswrapper[4786]: I0313 12:02:07.451480 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd421e3-789b-475c-9a4d-b20bc95f15bf" path="/var/lib/kubelet/pods/cbd421e3-789b-475c-9a4d-b20bc95f15bf/volumes"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.131144 4786 scope.go:117] "RemoveContainer" containerID="2f3e95662904899e924a6719689e2bed3be873fcc70e0d2706572a2224ae5d93"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.207215 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"]
Mar 13 12:02:10 crc kubenswrapper[4786]: E0313 12:02:10.207450 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" containerName="storage"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.207470 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" containerName="storage"
Mar 13 12:02:10 crc kubenswrapper[4786]: E0313 12:02:10.207494 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af93e5fc-a1db-44c4-aecb-db9648c603ab" containerName="oc"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.207504 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="af93e5fc-a1db-44c4-aecb-db9648c603ab" containerName="oc"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.207612 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43fae39-7b51-4cc4-bf5e-68fb8cebd4ae" containerName="storage"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.207639 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="af93e5fc-a1db-44c4-aecb-db9648c603ab" containerName="oc"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.208561 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.210190 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.216909 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"]
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.225203 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brl6g\" (UniqueName: \"kubernetes.io/projected/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-kube-api-access-brl6g\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.225300 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.225345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.326695 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brl6g\" (UniqueName: \"kubernetes.io/projected/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-kube-api-access-brl6g\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.326984 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.327089 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.327710 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.328043 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.358611 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brl6g\" (UniqueName: \"kubernetes.io/projected/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-kube-api-access-brl6g\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.533023 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.750754 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"]
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.791615 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hbx9j"
Mar 13 12:02:10 crc kubenswrapper[4786]: I0313 12:02:10.838852 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hbx9j"
Mar 13 12:02:11 crc kubenswrapper[4786]: I0313 12:02:11.258608 4786 generic.go:334] "Generic (PLEG): container finished" podID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerID="e8fac230d424268dc7f21ccf9a334833ec15435c5846f081ec4b3a2b00af5cd2" exitCode=0
Mar 13 12:02:11 crc kubenswrapper[4786]: I0313 12:02:11.258673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb" event={"ID":"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5","Type":"ContainerDied","Data":"e8fac230d424268dc7f21ccf9a334833ec15435c5846f081ec4b3a2b00af5cd2"}
Mar 13 12:02:11 crc kubenswrapper[4786]: I0313 12:02:11.258730 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb" event={"ID":"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5","Type":"ContainerStarted","Data":"b9e7990f31d37bad21401353d565e62b0b81043d428300927f690a219ffd7a99"}
Mar 13 12:02:13 crc kubenswrapper[4786]: I0313 12:02:13.271796 4786 generic.go:334] "Generic (PLEG): container finished" podID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerID="1b8318e6eeafd8ad1820d6e71d2282faeba6c1a46b197a52158be80fe025b77f" exitCode=0
Mar 13 12:02:13 crc kubenswrapper[4786]: I0313 12:02:13.271920 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb" event={"ID":"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5","Type":"ContainerDied","Data":"1b8318e6eeafd8ad1820d6e71d2282faeba6c1a46b197a52158be80fe025b77f"}
Mar 13 12:02:13 crc kubenswrapper[4786]: I0313 12:02:13.855772 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hbx9j"]
Mar 13 12:02:13 crc kubenswrapper[4786]: I0313 12:02:13.856535 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hbx9j" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="registry-server" containerID="cri-o://465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d" gracePeriod=2
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.280717 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbx9j"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.281705 4786 generic.go:334] "Generic (PLEG): container finished" podID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerID="465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d" exitCode=0
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.281782 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbx9j" event={"ID":"4137de19-8f78-4d95-bfa9-eaad93f1cbfb","Type":"ContainerDied","Data":"465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d"}
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.281825 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbx9j" event={"ID":"4137de19-8f78-4d95-bfa9-eaad93f1cbfb","Type":"ContainerDied","Data":"0c409fd66bbbc22e62c7cbabfbeaf11b479f564976ec0d70685d28a1865ed700"}
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.281845 4786 scope.go:117] "RemoveContainer" containerID="465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.288031 4786 generic.go:334] "Generic (PLEG): container finished" podID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerID="2dc621cc41126009352b14b45c5d3c785e827712b37f5efc4e8821c1ac46c5dd" exitCode=0
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.288085 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb" event={"ID":"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5","Type":"ContainerDied","Data":"2dc621cc41126009352b14b45c5d3c785e827712b37f5efc4e8821c1ac46c5dd"}
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.310364 4786 scope.go:117] "RemoveContainer" containerID="7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.349226 4786 scope.go:117] "RemoveContainer" containerID="4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.397284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-utilities\") pod \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") "
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.397381 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-catalog-content\") pod \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") "
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.397489 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkppz\" (UniqueName: \"kubernetes.io/projected/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-kube-api-access-qkppz\") pod \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\" (UID: \"4137de19-8f78-4d95-bfa9-eaad93f1cbfb\") "
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.398631 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-utilities" (OuterVolumeSpecName: "utilities") pod "4137de19-8f78-4d95-bfa9-eaad93f1cbfb" (UID: "4137de19-8f78-4d95-bfa9-eaad93f1cbfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.407408 4786 scope.go:117] "RemoveContainer" containerID="465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d"
Mar 13 12:02:14 crc kubenswrapper[4786]: E0313 12:02:14.407860 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d\": container with ID starting with 465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d not found: ID does not exist" containerID="465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.408007 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d"} err="failed to get container status \"465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d\": rpc error: code = NotFound desc = could not find container \"465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d\": container with ID starting with 465e3a03315aeb8f38d622be2ff7ece677d04697f26a57f0c0d8659860d8776d not found: ID does not exist"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.408036 4786 scope.go:117] "RemoveContainer" containerID="7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105"
Mar 13 12:02:14 crc kubenswrapper[4786]: E0313 12:02:14.409271 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105\": container with ID starting with 7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105 not found: ID does not exist" containerID="7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.409303 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105"} err="failed to get container status \"7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105\": rpc error: code = NotFound desc = could not find container \"7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105\": container with ID starting with 7be1ac1974aeea62b75748575dce77550ff7ffa19e1f25dceae49c86c5d29105 not found: ID does not exist"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.409322 4786 scope.go:117] "RemoveContainer" containerID="4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.409271 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-kube-api-access-qkppz" (OuterVolumeSpecName: "kube-api-access-qkppz") pod "4137de19-8f78-4d95-bfa9-eaad93f1cbfb" (UID: "4137de19-8f78-4d95-bfa9-eaad93f1cbfb"). InnerVolumeSpecName "kube-api-access-qkppz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:02:14 crc kubenswrapper[4786]: E0313 12:02:14.409577 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572\": container with ID starting with 4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572 not found: ID does not exist" containerID="4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.409623 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572"} err="failed to get container status \"4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572\": rpc error: code = NotFound desc = could not find container \"4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572\": container with ID starting with 4a40db5179a692dbb97ee41cd48746efcb5ff17dd7521f45aa576188289db572 not found: ID does not exist"
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.455388 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4137de19-8f78-4d95-bfa9-eaad93f1cbfb" (UID: "4137de19-8f78-4d95-bfa9-eaad93f1cbfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.498461 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.498492 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:14 crc kubenswrapper[4786]: I0313 12:02:14.498515 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkppz\" (UniqueName: \"kubernetes.io/projected/4137de19-8f78-4d95-bfa9-eaad93f1cbfb-kube-api-access-qkppz\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.296401 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbx9j"
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.345438 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hbx9j"]
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.356918 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hbx9j"]
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.452504 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" path="/var/lib/kubelet/pods/4137de19-8f78-4d95-bfa9-eaad93f1cbfb/volumes"
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.590650 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.720295 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brl6g\" (UniqueName: \"kubernetes.io/projected/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-kube-api-access-brl6g\") pod \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") "
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.720436 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-util\") pod \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") "
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.720789 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-bundle\") pod \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\" (UID: \"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5\") "
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.723420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-bundle" (OuterVolumeSpecName: "bundle") pod "2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" (UID: "2e5b85ae-234c-423a-bae6-0a3bbe74f5c5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.731792 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-kube-api-access-brl6g" (OuterVolumeSpecName: "kube-api-access-brl6g") pod "2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" (UID: "2e5b85ae-234c-423a-bae6-0a3bbe74f5c5"). InnerVolumeSpecName "kube-api-access-brl6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.758336 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-util" (OuterVolumeSpecName: "util") pod "2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" (UID: "2e5b85ae-234c-423a-bae6-0a3bbe74f5c5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.823519 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.823600 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brl6g\" (UniqueName: \"kubernetes.io/projected/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-kube-api-access-brl6g\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:15 crc kubenswrapper[4786]: I0313 12:02:15.823622 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e5b85ae-234c-423a-bae6-0a3bbe74f5c5-util\") on node \"crc\" DevicePath \"\""
Mar 13 12:02:16 crc kubenswrapper[4786]: I0313 12:02:16.306570 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb" event={"ID":"2e5b85ae-234c-423a-bae6-0a3bbe74f5c5","Type":"ContainerDied","Data":"b9e7990f31d37bad21401353d565e62b0b81043d428300927f690a219ffd7a99"}
Mar 13 12:02:16 crc kubenswrapper[4786]: I0313 12:02:16.306632 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9e7990f31d37bad21401353d565e62b0b81043d428300927f690a219ffd7a99"
Mar 13 12:02:16 crc kubenswrapper[4786]: I0313 12:02:16.306721 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.402163 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7"]
Mar 13 12:02:19 crc kubenswrapper[4786]: E0313 12:02:19.402809 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerName="util"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.402823 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerName="util"
Mar 13 12:02:19 crc kubenswrapper[4786]: E0313 12:02:19.402841 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="registry-server"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.402849 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="registry-server"
Mar 13 12:02:19 crc kubenswrapper[4786]: E0313 12:02:19.402862 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="extract-utilities"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.402870 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="extract-utilities"
Mar 13 12:02:19 crc kubenswrapper[4786]: E0313 12:02:19.402904 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerName="pull"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.402912 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerName="pull"
Mar 13 12:02:19 crc kubenswrapper[4786]: E0313 12:02:19.402924 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="extract-content"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.402933 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="extract-content"
Mar 13 12:02:19 crc kubenswrapper[4786]: E0313 12:02:19.402947 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerName="extract"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.402955 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerName="extract"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.403071 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5b85ae-234c-423a-bae6-0a3bbe74f5c5" containerName="extract"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.403084 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4137de19-8f78-4d95-bfa9-eaad93f1cbfb" containerName="registry-server"
Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.403496 4786 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7" Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.405345 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-lmvmb" Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.405673 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.417177 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7"] Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.419872 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.465330 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm277\" (UniqueName: \"kubernetes.io/projected/15ab9de9-0f55-424e-9717-d5452bcefb67-kube-api-access-mm277\") pod \"nmstate-operator-796d4cfff4-6q6x7\" (UID: \"15ab9de9-0f55-424e-9717-d5452bcefb67\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7" Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.566668 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm277\" (UniqueName: \"kubernetes.io/projected/15ab9de9-0f55-424e-9717-d5452bcefb67-kube-api-access-mm277\") pod \"nmstate-operator-796d4cfff4-6q6x7\" (UID: \"15ab9de9-0f55-424e-9717-d5452bcefb67\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7" Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.585183 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm277\" (UniqueName: \"kubernetes.io/projected/15ab9de9-0f55-424e-9717-d5452bcefb67-kube-api-access-mm277\") pod \"nmstate-operator-796d4cfff4-6q6x7\" (UID: 
\"15ab9de9-0f55-424e-9717-d5452bcefb67\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7" Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.723873 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7" Mar 13 12:02:19 crc kubenswrapper[4786]: I0313 12:02:19.904542 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7"] Mar 13 12:02:20 crc kubenswrapper[4786]: I0313 12:02:20.334507 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7" event={"ID":"15ab9de9-0f55-424e-9717-d5452bcefb67","Type":"ContainerStarted","Data":"a959c7d3ca5bc0591abd5b4e68d48ecf1500ce0da94f75198a23c6c0fcb11c40"} Mar 13 12:02:22 crc kubenswrapper[4786]: I0313 12:02:22.353602 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7" event={"ID":"15ab9de9-0f55-424e-9717-d5452bcefb67","Type":"ContainerStarted","Data":"bb1bae81b37cb33ac3932a1dd4abb1ea55a2801e29a4cb9095d843d3904087cb"} Mar 13 12:02:22 crc kubenswrapper[4786]: I0313 12:02:22.394114 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6q6x7" podStartSLOduration=1.200661997 podStartE2EDuration="3.394086293s" podCreationTimestamp="2026-03-13 12:02:19 +0000 UTC" firstStartedPulling="2026-03-13 12:02:19.911549688 +0000 UTC m=+927.191203135" lastFinishedPulling="2026-03-13 12:02:22.104973984 +0000 UTC m=+929.384627431" observedRunningTime="2026-03-13 12:02:22.386239172 +0000 UTC m=+929.665892719" watchObservedRunningTime="2026-03-13 12:02:22.394086293 +0000 UTC m=+929.673739770" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.260747 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 
12:02:30.262023 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.267573 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rrw27" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.272928 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.276782 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.277377 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.279426 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.289575 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-kh9b9"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.290162 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.329525 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2hjk\" (UniqueName: \"kubernetes.io/projected/1df03c26-f726-4375-b92a-1d304e653168-kube-api-access-z2hjk\") pod \"nmstate-webhook-5f558f5558-7l6wg\" (UID: \"1df03c26-f726-4375-b92a-1d304e653168\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.329569 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1df03c26-f726-4375-b92a-1d304e653168-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7l6wg\" (UID: \"1df03c26-f726-4375-b92a-1d304e653168\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.329614 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwzhr\" (UniqueName: \"kubernetes.io/projected/38c81a5f-f37d-4dc5-aad9-ffe72690e341-kube-api-access-gwzhr\") pod \"nmstate-metrics-9b8c8685d-pbscj\" (UID: \"38c81a5f-f37d-4dc5-aad9-ffe72690e341\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.336724 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.389189 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.389802 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.398779 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.399000 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-59g47" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.399130 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.404930 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431139 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-ovs-socket\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431191 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flk98\" (UniqueName: \"kubernetes.io/projected/9d4d1448-188d-4c49-b287-8a7bc2298b06-kube-api-access-flk98\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431226 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2hjk\" (UniqueName: \"kubernetes.io/projected/1df03c26-f726-4375-b92a-1d304e653168-kube-api-access-z2hjk\") pod \"nmstate-webhook-5f558f5558-7l6wg\" (UID: \"1df03c26-f726-4375-b92a-1d304e653168\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431261 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1df03c26-f726-4375-b92a-1d304e653168-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7l6wg\" (UID: \"1df03c26-f726-4375-b92a-1d304e653168\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-dbus-socket\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668af0-a94b-4bed-a518-e18d2ac8692d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431321 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-nmstate-lock\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88668af0-a94b-4bed-a518-e18d2ac8692d-nginx-conf\") pod 
\"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431366 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwzhr\" (UniqueName: \"kubernetes.io/projected/38c81a5f-f37d-4dc5-aad9-ffe72690e341-kube-api-access-gwzhr\") pod \"nmstate-metrics-9b8c8685d-pbscj\" (UID: \"38c81a5f-f37d-4dc5-aad9-ffe72690e341\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.431388 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsmjh\" (UniqueName: \"kubernetes.io/projected/88668af0-a94b-4bed-a518-e18d2ac8692d-kube-api-access-wsmjh\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: E0313 12:02:30.431418 4786 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 13 12:02:30 crc kubenswrapper[4786]: E0313 12:02:30.431481 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1df03c26-f726-4375-b92a-1d304e653168-tls-key-pair podName:1df03c26-f726-4375-b92a-1d304e653168 nodeName:}" failed. No retries permitted until 2026-03-13 12:02:30.931463172 +0000 UTC m=+938.211116619 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/1df03c26-f726-4375-b92a-1d304e653168-tls-key-pair") pod "nmstate-webhook-5f558f5558-7l6wg" (UID: "1df03c26-f726-4375-b92a-1d304e653168") : secret "openshift-nmstate-webhook" not found Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.448673 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2hjk\" (UniqueName: \"kubernetes.io/projected/1df03c26-f726-4375-b92a-1d304e653168-kube-api-access-z2hjk\") pod \"nmstate-webhook-5f558f5558-7l6wg\" (UID: \"1df03c26-f726-4375-b92a-1d304e653168\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.449119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwzhr\" (UniqueName: \"kubernetes.io/projected/38c81a5f-f37d-4dc5-aad9-ffe72690e341-kube-api-access-gwzhr\") pod \"nmstate-metrics-9b8c8685d-pbscj\" (UID: \"38c81a5f-f37d-4dc5-aad9-ffe72690e341\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532193 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668af0-a94b-4bed-a518-e18d2ac8692d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532239 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-nmstate-lock\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532263 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88668af0-a94b-4bed-a518-e18d2ac8692d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532293 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsmjh\" (UniqueName: \"kubernetes.io/projected/88668af0-a94b-4bed-a518-e18d2ac8692d-kube-api-access-wsmjh\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-ovs-socket\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532338 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flk98\" (UniqueName: \"kubernetes.io/projected/9d4d1448-188d-4c49-b287-8a7bc2298b06-kube-api-access-flk98\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532387 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-dbus-socket\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532678 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-dbus-socket\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: E0313 12:02:30.532743 4786 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 13 12:02:30 crc kubenswrapper[4786]: E0313 12:02:30.533038 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88668af0-a94b-4bed-a518-e18d2ac8692d-plugin-serving-cert podName:88668af0-a94b-4bed-a518-e18d2ac8692d nodeName:}" failed. No retries permitted until 2026-03-13 12:02:31.033020368 +0000 UTC m=+938.312673815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/88668af0-a94b-4bed-a518-e18d2ac8692d-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-pgk4q" (UID: "88668af0-a94b-4bed-a518-e18d2ac8692d") : secret "plugin-serving-cert" not found Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.532790 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-nmstate-lock\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.533262 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9d4d1448-188d-4c49-b287-8a7bc2298b06-ovs-socket\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.533754 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88668af0-a94b-4bed-a518-e18d2ac8692d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.551408 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsmjh\" (UniqueName: \"kubernetes.io/projected/88668af0-a94b-4bed-a518-e18d2ac8692d-kube-api-access-wsmjh\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.554899 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flk98\" (UniqueName: \"kubernetes.io/projected/9d4d1448-188d-4c49-b287-8a7bc2298b06-kube-api-access-flk98\") pod \"nmstate-handler-kh9b9\" (UID: \"9d4d1448-188d-4c49-b287-8a7bc2298b06\") " pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.580176 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.596451 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-65d688f545-qfvjm"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.597062 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.609351 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d688f545-qfvjm"] Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.631173 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:30 crc kubenswrapper[4786]: W0313 12:02:30.686245 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4d1448_188d_4c49_b287_8a7bc2298b06.slice/crio-6f233266f53654a9f85ff097b05736103593823f3f0f5cdd48d25d1009ee05a3 WatchSource:0}: Error finding container 6f233266f53654a9f85ff097b05736103593823f3f0f5cdd48d25d1009ee05a3: Status 404 returned error can't find the container with id 6f233266f53654a9f85ff097b05736103593823f3f0f5cdd48d25d1009ee05a3 Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.736459 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-service-ca\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.736506 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-serving-cert\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.736533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d589d\" (UniqueName: \"kubernetes.io/projected/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-kube-api-access-d589d\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.736550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-trusted-ca-bundle\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.736581 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-config\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.736601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-oauth-serving-cert\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.736653 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-oauth-config\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.807558 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj"] Mar 13 12:02:30 crc kubenswrapper[4786]: W0313 12:02:30.813354 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38c81a5f_f37d_4dc5_aad9_ffe72690e341.slice/crio-22243590022a25224f795d8c6b37518c003af826b17d47648fea2b57d0b5aa7c 
WatchSource:0}: Error finding container 22243590022a25224f795d8c6b37518c003af826b17d47648fea2b57d0b5aa7c: Status 404 returned error can't find the container with id 22243590022a25224f795d8c6b37518c003af826b17d47648fea2b57d0b5aa7c Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.837563 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-oauth-serving-cert\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.837638 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-oauth-config\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.837669 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-service-ca\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.837690 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-serving-cert\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.837717 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d589d\" (UniqueName: 
\"kubernetes.io/projected/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-kube-api-access-d589d\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.837734 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-trusted-ca-bundle\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.837762 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-config\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.838494 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-oauth-serving-cert\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.838609 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-config\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.838619 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-service-ca\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.838909 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-trusted-ca-bundle\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.843320 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-oauth-config\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.843434 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-console-serving-cert\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.852654 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d589d\" (UniqueName: \"kubernetes.io/projected/7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca-kube-api-access-d589d\") pod \"console-65d688f545-qfvjm\" (UID: \"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca\") " pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.938791 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/1df03c26-f726-4375-b92a-1d304e653168-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7l6wg\" (UID: \"1df03c26-f726-4375-b92a-1d304e653168\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.943174 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1df03c26-f726-4375-b92a-1d304e653168-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7l6wg\" (UID: \"1df03c26-f726-4375-b92a-1d304e653168\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:30 crc kubenswrapper[4786]: I0313 12:02:30.965072 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.041478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668af0-a94b-4bed-a518-e18d2ac8692d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.045873 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668af0-a94b-4bed-a518-e18d2ac8692d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-pgk4q\" (UID: \"88668af0-a94b-4bed-a518-e18d2ac8692d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.194613 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.201094 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d688f545-qfvjm"] Mar 13 12:02:31 crc kubenswrapper[4786]: W0313 12:02:31.205425 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cfd9b71_30bd_48a5_b1a5_b84aab2da9ca.slice/crio-c00f96de30583328b7fc603708a918f453bb61b91b0a5fe4d5fd42df3003dcae WatchSource:0}: Error finding container c00f96de30583328b7fc603708a918f453bb61b91b0a5fe4d5fd42df3003dcae: Status 404 returned error can't find the container with id c00f96de30583328b7fc603708a918f453bb61b91b0a5fe4d5fd42df3003dcae Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.303621 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.400548 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg"] Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.438165 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" event={"ID":"38c81a5f-f37d-4dc5-aad9-ffe72690e341","Type":"ContainerStarted","Data":"22243590022a25224f795d8c6b37518c003af826b17d47648fea2b57d0b5aa7c"} Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.448862 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d688f545-qfvjm" event={"ID":"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca","Type":"ContainerStarted","Data":"86dd49ee9eb069c962a835ee0c6601dedbffa44480233d5433aa4396bebd96b7"} Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.448929 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d688f545-qfvjm" 
event={"ID":"7cfd9b71-30bd-48a5-b1a5-b84aab2da9ca","Type":"ContainerStarted","Data":"c00f96de30583328b7fc603708a918f453bb61b91b0a5fe4d5fd42df3003dcae"} Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.448944 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" event={"ID":"1df03c26-f726-4375-b92a-1d304e653168","Type":"ContainerStarted","Data":"33943efc2cbcdbfbb058bcd740d31cfc1d41741e34d30bb35651b3053c6b2534"} Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.448957 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kh9b9" event={"ID":"9d4d1448-188d-4c49-b287-8a7bc2298b06","Type":"ContainerStarted","Data":"6f233266f53654a9f85ff097b05736103593823f3f0f5cdd48d25d1009ee05a3"} Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.463943 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65d688f545-qfvjm" podStartSLOduration=1.463922049 podStartE2EDuration="1.463922049s" podCreationTimestamp="2026-03-13 12:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:02:31.458534651 +0000 UTC m=+938.738188118" watchObservedRunningTime="2026-03-13 12:02:31.463922049 +0000 UTC m=+938.743575516" Mar 13 12:02:31 crc kubenswrapper[4786]: I0313 12:02:31.531436 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q"] Mar 13 12:02:31 crc kubenswrapper[4786]: W0313 12:02:31.533653 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88668af0_a94b_4bed_a518_e18d2ac8692d.slice/crio-5d9e53ebd2f1a91d4273312af07dee1474bb30a8e13ffe89d66fe531200b8cb8 WatchSource:0}: Error finding container 5d9e53ebd2f1a91d4273312af07dee1474bb30a8e13ffe89d66fe531200b8cb8: Status 404 returned error can't 
find the container with id 5d9e53ebd2f1a91d4273312af07dee1474bb30a8e13ffe89d66fe531200b8cb8 Mar 13 12:02:32 crc kubenswrapper[4786]: I0313 12:02:32.453754 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" event={"ID":"88668af0-a94b-4bed-a518-e18d2ac8692d","Type":"ContainerStarted","Data":"5d9e53ebd2f1a91d4273312af07dee1474bb30a8e13ffe89d66fe531200b8cb8"} Mar 13 12:02:33 crc kubenswrapper[4786]: I0313 12:02:33.469907 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" event={"ID":"38c81a5f-f37d-4dc5-aad9-ffe72690e341","Type":"ContainerStarted","Data":"95f3aac526033413557328b9ade2b4f29a38d1a6f3d6b92a05b7e432ba9c0618"} Mar 13 12:02:33 crc kubenswrapper[4786]: I0313 12:02:33.473815 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" event={"ID":"1df03c26-f726-4375-b92a-1d304e653168","Type":"ContainerStarted","Data":"8c268ef31f5f73952bddfff92c0c2ebeca40cb1af557ffd560ef3edbde47cf50"} Mar 13 12:02:33 crc kubenswrapper[4786]: I0313 12:02:33.474026 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:02:33 crc kubenswrapper[4786]: I0313 12:02:33.475401 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kh9b9" event={"ID":"9d4d1448-188d-4c49-b287-8a7bc2298b06","Type":"ContainerStarted","Data":"28b16259acd18913d9f477e3623b5deed06e76285ad0135ebf223d28c502bd76"} Mar 13 12:02:33 crc kubenswrapper[4786]: I0313 12:02:33.475616 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:33 crc kubenswrapper[4786]: I0313 12:02:33.524619 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-kh9b9" podStartSLOduration=1.159690794 
podStartE2EDuration="3.524588522s" podCreationTimestamp="2026-03-13 12:02:30 +0000 UTC" firstStartedPulling="2026-03-13 12:02:30.703060764 +0000 UTC m=+937.982714211" lastFinishedPulling="2026-03-13 12:02:33.067958492 +0000 UTC m=+940.347611939" observedRunningTime="2026-03-13 12:02:33.518183509 +0000 UTC m=+940.797836966" watchObservedRunningTime="2026-03-13 12:02:33.524588522 +0000 UTC m=+940.804241989" Mar 13 12:02:33 crc kubenswrapper[4786]: I0313 12:02:33.541456 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" podStartSLOduration=1.873375163 podStartE2EDuration="3.541394382s" podCreationTimestamp="2026-03-13 12:02:30 +0000 UTC" firstStartedPulling="2026-03-13 12:02:31.43500726 +0000 UTC m=+938.714660697" lastFinishedPulling="2026-03-13 12:02:33.103026469 +0000 UTC m=+940.382679916" observedRunningTime="2026-03-13 12:02:33.535897311 +0000 UTC m=+940.815550778" watchObservedRunningTime="2026-03-13 12:02:33.541394382 +0000 UTC m=+940.821047849" Mar 13 12:02:34 crc kubenswrapper[4786]: I0313 12:02:34.483208 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" event={"ID":"88668af0-a94b-4bed-a518-e18d2ac8692d","Type":"ContainerStarted","Data":"3490d1e7b3df6e8ffe5faa9acadc5cdde87c5fd42fd21c0367a1d489f0aea0a4"} Mar 13 12:02:34 crc kubenswrapper[4786]: I0313 12:02:34.503991 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pgk4q" podStartSLOduration=2.116348053 podStartE2EDuration="4.503968472s" podCreationTimestamp="2026-03-13 12:02:30 +0000 UTC" firstStartedPulling="2026-03-13 12:02:31.535512469 +0000 UTC m=+938.815165916" lastFinishedPulling="2026-03-13 12:02:33.923132878 +0000 UTC m=+941.202786335" observedRunningTime="2026-03-13 12:02:34.497744553 +0000 UTC m=+941.777398040" watchObservedRunningTime="2026-03-13 12:02:34.503968472 +0000 UTC 
m=+941.783621959" Mar 13 12:02:36 crc kubenswrapper[4786]: I0313 12:02:36.499430 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" event={"ID":"38c81a5f-f37d-4dc5-aad9-ffe72690e341","Type":"ContainerStarted","Data":"ab309bbb7ee1fdd93954b16ace9825f03ece2b0a24592f2fbe74b558e117d4f8"} Mar 13 12:02:36 crc kubenswrapper[4786]: I0313 12:02:36.532447 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pbscj" podStartSLOduration=1.694148902 podStartE2EDuration="6.532411143s" podCreationTimestamp="2026-03-13 12:02:30 +0000 UTC" firstStartedPulling="2026-03-13 12:02:30.816285167 +0000 UTC m=+938.095938614" lastFinishedPulling="2026-03-13 12:02:35.654547398 +0000 UTC m=+942.934200855" observedRunningTime="2026-03-13 12:02:36.525984168 +0000 UTC m=+943.805637675" watchObservedRunningTime="2026-03-13 12:02:36.532411143 +0000 UTC m=+943.812064630" Mar 13 12:02:40 crc kubenswrapper[4786]: I0313 12:02:40.673730 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-kh9b9" Mar 13 12:02:40 crc kubenswrapper[4786]: I0313 12:02:40.966381 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:40 crc kubenswrapper[4786]: I0313 12:02:40.966460 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:40 crc kubenswrapper[4786]: I0313 12:02:40.975483 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:41 crc kubenswrapper[4786]: I0313 12:02:41.537650 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65d688f545-qfvjm" Mar 13 12:02:41 crc kubenswrapper[4786]: I0313 12:02:41.631021 4786 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-console/console-f9d7485db-nzss5"] Mar 13 12:02:51 crc kubenswrapper[4786]: I0313 12:02:51.202526 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7l6wg" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.333028 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv"] Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.335534 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.341102 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.344197 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv"] Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.433088 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q2ms\" (UniqueName: \"kubernetes.io/projected/cc164e59-8f60-4750-ab7b-935346318ac8-kube-api-access-7q2ms\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.433486 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.433575 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.534263 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.534390 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.534446 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q2ms\" (UniqueName: \"kubernetes.io/projected/cc164e59-8f60-4750-ab7b-935346318ac8-kube-api-access-7q2ms\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: 
I0313 12:03:06.534998 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.536036 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.569334 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q2ms\" (UniqueName: \"kubernetes.io/projected/cc164e59-8f60-4750-ab7b-935346318ac8-kube-api-access-7q2ms\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.677935 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.677916 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nzss5" podUID="ed0ec184-b55e-474a-9e11-72957a85689d" containerName="console" containerID="cri-o://082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075" gracePeriod=15 Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.858544 4786 patch_prober.go:28] interesting pod/console-f9d7485db-nzss5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 13 12:03:06 crc kubenswrapper[4786]: I0313 12:03:06.858798 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-nzss5" podUID="ed0ec184-b55e-474a-9e11-72957a85689d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.054850 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nzss5_ed0ec184-b55e-474a-9e11-72957a85689d/console/0.log" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.054963 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nzss5" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.137783 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv"] Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.141319 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-trusted-ca-bundle\") pod \"ed0ec184-b55e-474a-9e11-72957a85689d\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.141378 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-oauth-serving-cert\") pod \"ed0ec184-b55e-474a-9e11-72957a85689d\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.141424 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-console-config\") pod \"ed0ec184-b55e-474a-9e11-72957a85689d\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.141486 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46p2f\" (UniqueName: \"kubernetes.io/projected/ed0ec184-b55e-474a-9e11-72957a85689d-kube-api-access-46p2f\") pod \"ed0ec184-b55e-474a-9e11-72957a85689d\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.141521 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-serving-cert\") pod 
\"ed0ec184-b55e-474a-9e11-72957a85689d\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.141540 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-service-ca\") pod \"ed0ec184-b55e-474a-9e11-72957a85689d\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.141564 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-oauth-config\") pod \"ed0ec184-b55e-474a-9e11-72957a85689d\" (UID: \"ed0ec184-b55e-474a-9e11-72957a85689d\") " Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.142318 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ed0ec184-b55e-474a-9e11-72957a85689d" (UID: "ed0ec184-b55e-474a-9e11-72957a85689d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.142330 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ed0ec184-b55e-474a-9e11-72957a85689d" (UID: "ed0ec184-b55e-474a-9e11-72957a85689d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.142344 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-console-config" (OuterVolumeSpecName: "console-config") pod "ed0ec184-b55e-474a-9e11-72957a85689d" (UID: "ed0ec184-b55e-474a-9e11-72957a85689d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.142355 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-service-ca" (OuterVolumeSpecName: "service-ca") pod "ed0ec184-b55e-474a-9e11-72957a85689d" (UID: "ed0ec184-b55e-474a-9e11-72957a85689d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.148234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ed0ec184-b55e-474a-9e11-72957a85689d" (UID: "ed0ec184-b55e-474a-9e11-72957a85689d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.148396 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0ec184-b55e-474a-9e11-72957a85689d-kube-api-access-46p2f" (OuterVolumeSpecName: "kube-api-access-46p2f") pod "ed0ec184-b55e-474a-9e11-72957a85689d" (UID: "ed0ec184-b55e-474a-9e11-72957a85689d"). InnerVolumeSpecName "kube-api-access-46p2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.149569 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ed0ec184-b55e-474a-9e11-72957a85689d" (UID: "ed0ec184-b55e-474a-9e11-72957a85689d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.244056 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46p2f\" (UniqueName: \"kubernetes.io/projected/ed0ec184-b55e-474a-9e11-72957a85689d-kube-api-access-46p2f\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.244111 4786 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.244135 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.244158 4786 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed0ec184-b55e-474a-9e11-72957a85689d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.244180 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.244205 4786 reconciler_common.go:293] "Volume detached for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.244226 4786 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed0ec184-b55e-474a-9e11-72957a85689d-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.715523 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nzss5_ed0ec184-b55e-474a-9e11-72957a85689d/console/0.log" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.715841 4786 generic.go:334] "Generic (PLEG): container finished" podID="ed0ec184-b55e-474a-9e11-72957a85689d" containerID="082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075" exitCode=2 Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.715948 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nzss5" event={"ID":"ed0ec184-b55e-474a-9e11-72957a85689d","Type":"ContainerDied","Data":"082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075"} Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.715965 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nzss5" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.715991 4786 scope.go:117] "RemoveContainer" containerID="082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.715979 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nzss5" event={"ID":"ed0ec184-b55e-474a-9e11-72957a85689d","Type":"ContainerDied","Data":"fe5d9ef7255e9e108a783b4a0bfb3f12716e9a99ab2c438e1fe8fc9db78b2bc9"} Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.720125 4786 generic.go:334] "Generic (PLEG): container finished" podID="cc164e59-8f60-4750-ab7b-935346318ac8" containerID="dcfcd40a3545d4bc67a69f12b504318a7d6730ef5031afcc4de2c605603209eb" exitCode=0 Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.720175 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" event={"ID":"cc164e59-8f60-4750-ab7b-935346318ac8","Type":"ContainerDied","Data":"dcfcd40a3545d4bc67a69f12b504318a7d6730ef5031afcc4de2c605603209eb"} Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.720205 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" event={"ID":"cc164e59-8f60-4750-ab7b-935346318ac8","Type":"ContainerStarted","Data":"33e76312483c79c4f07ac6b8ea46f6c1cfee27c13100d43f55d0c45eb82d8620"} Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.744544 4786 scope.go:117] "RemoveContainer" containerID="082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075" Mar 13 12:03:07 crc kubenswrapper[4786]: E0313 12:03:07.745233 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075\": container with ID 
starting with 082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075 not found: ID does not exist" containerID="082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.745268 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075"} err="failed to get container status \"082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075\": rpc error: code = NotFound desc = could not find container \"082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075\": container with ID starting with 082910f079acc83709dbfd5f1bbc8a2361a09050b73035100e9f1bf5f9bdc075 not found: ID does not exist" Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.749999 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nzss5"] Mar 13 12:03:07 crc kubenswrapper[4786]: I0313 12:03:07.757036 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nzss5"] Mar 13 12:03:09 crc kubenswrapper[4786]: I0313 12:03:09.452767 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0ec184-b55e-474a-9e11-72957a85689d" path="/var/lib/kubelet/pods/ed0ec184-b55e-474a-9e11-72957a85689d/volumes" Mar 13 12:03:09 crc kubenswrapper[4786]: I0313 12:03:09.746508 4786 generic.go:334] "Generic (PLEG): container finished" podID="cc164e59-8f60-4750-ab7b-935346318ac8" containerID="09b344e78bd6fdf9cfffdc07a0e5378940e9689d8f72dcf38b5cede94d81627f" exitCode=0 Mar 13 12:03:09 crc kubenswrapper[4786]: I0313 12:03:09.746569 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" event={"ID":"cc164e59-8f60-4750-ab7b-935346318ac8","Type":"ContainerDied","Data":"09b344e78bd6fdf9cfffdc07a0e5378940e9689d8f72dcf38b5cede94d81627f"} Mar 13 12:03:10 crc 
kubenswrapper[4786]: I0313 12:03:10.756219 4786 generic.go:334] "Generic (PLEG): container finished" podID="cc164e59-8f60-4750-ab7b-935346318ac8" containerID="512d7415aa2d46e4709f486cd3035e0b223ed6ff9bb2f4c7fa8c75f4a5e2649e" exitCode=0 Mar 13 12:03:10 crc kubenswrapper[4786]: I0313 12:03:10.756275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" event={"ID":"cc164e59-8f60-4750-ab7b-935346318ac8","Type":"ContainerDied","Data":"512d7415aa2d46e4709f486cd3035e0b223ed6ff9bb2f4c7fa8c75f4a5e2649e"} Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.078467 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.106498 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-bundle\") pod \"cc164e59-8f60-4750-ab7b-935346318ac8\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.106650 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-util\") pod \"cc164e59-8f60-4750-ab7b-935346318ac8\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.106691 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q2ms\" (UniqueName: \"kubernetes.io/projected/cc164e59-8f60-4750-ab7b-935346318ac8-kube-api-access-7q2ms\") pod \"cc164e59-8f60-4750-ab7b-935346318ac8\" (UID: \"cc164e59-8f60-4750-ab7b-935346318ac8\") " Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.108749 4786 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-bundle" (OuterVolumeSpecName: "bundle") pod "cc164e59-8f60-4750-ab7b-935346318ac8" (UID: "cc164e59-8f60-4750-ab7b-935346318ac8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.119158 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc164e59-8f60-4750-ab7b-935346318ac8-kube-api-access-7q2ms" (OuterVolumeSpecName: "kube-api-access-7q2ms") pod "cc164e59-8f60-4750-ab7b-935346318ac8" (UID: "cc164e59-8f60-4750-ab7b-935346318ac8"). InnerVolumeSpecName "kube-api-access-7q2ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.139872 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-util" (OuterVolumeSpecName: "util") pod "cc164e59-8f60-4750-ab7b-935346318ac8" (UID: "cc164e59-8f60-4750-ab7b-935346318ac8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.208597 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-util\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.208654 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q2ms\" (UniqueName: \"kubernetes.io/projected/cc164e59-8f60-4750-ab7b-935346318ac8-kube-api-access-7q2ms\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.208693 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc164e59-8f60-4750-ab7b-935346318ac8-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.776129 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" event={"ID":"cc164e59-8f60-4750-ab7b-935346318ac8","Type":"ContainerDied","Data":"33e76312483c79c4f07ac6b8ea46f6c1cfee27c13100d43f55d0c45eb82d8620"} Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.776631 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33e76312483c79c4f07ac6b8ea46f6c1cfee27c13100d43f55d0c45eb82d8620" Mar 13 12:03:12 crc kubenswrapper[4786]: I0313 12:03:12.776267 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.852043 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz"] Mar 13 12:03:21 crc kubenswrapper[4786]: E0313 12:03:21.852620 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0ec184-b55e-474a-9e11-72957a85689d" containerName="console" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.852632 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0ec184-b55e-474a-9e11-72957a85689d" containerName="console" Mar 13 12:03:21 crc kubenswrapper[4786]: E0313 12:03:21.852643 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc164e59-8f60-4750-ab7b-935346318ac8" containerName="extract" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.852649 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc164e59-8f60-4750-ab7b-935346318ac8" containerName="extract" Mar 13 12:03:21 crc kubenswrapper[4786]: E0313 12:03:21.852664 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc164e59-8f60-4750-ab7b-935346318ac8" containerName="pull" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.852670 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc164e59-8f60-4750-ab7b-935346318ac8" containerName="pull" Mar 13 12:03:21 crc kubenswrapper[4786]: E0313 12:03:21.852676 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc164e59-8f60-4750-ab7b-935346318ac8" containerName="util" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.852682 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc164e59-8f60-4750-ab7b-935346318ac8" containerName="util" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.852768 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc164e59-8f60-4750-ab7b-935346318ac8" 
containerName="extract" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.852844 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0ec184-b55e-474a-9e11-72957a85689d" containerName="console" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.853216 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.854529 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.854849 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.855003 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.855612 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tg98m" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.855852 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.873880 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz"] Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.984872 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f40266-579b-4e46-8c76-6085fc8b2824-apiservice-cert\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 
13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.984944 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f40266-579b-4e46-8c76-6085fc8b2824-webhook-cert\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:21 crc kubenswrapper[4786]: I0313 12:03:21.984975 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjg4l\" (UniqueName: \"kubernetes.io/projected/d7f40266-579b-4e46-8c76-6085fc8b2824-kube-api-access-zjg4l\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.085954 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f40266-579b-4e46-8c76-6085fc8b2824-apiservice-cert\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.086038 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f40266-579b-4e46-8c76-6085fc8b2824-webhook-cert\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.086083 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjg4l\" (UniqueName: 
\"kubernetes.io/projected/d7f40266-579b-4e46-8c76-6085fc8b2824-kube-api-access-zjg4l\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.093582 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f40266-579b-4e46-8c76-6085fc8b2824-apiservice-cert\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.093582 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f40266-579b-4e46-8c76-6085fc8b2824-webhook-cert\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.104445 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjg4l\" (UniqueName: \"kubernetes.io/projected/d7f40266-579b-4e46-8c76-6085fc8b2824-kube-api-access-zjg4l\") pod \"metallb-operator-controller-manager-5f7647b4b8-hmxxz\" (UID: \"d7f40266-579b-4e46-8c76-6085fc8b2824\") " pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.167974 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.172089 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt"] Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.172728 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.174314 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.175043 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.180815 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-h4jpg" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.200968 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt"] Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.289166 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4473a40-978f-429a-81cb-34ca70c51ecc-apiservice-cert\") pod \"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.289510 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4473a40-978f-429a-81cb-34ca70c51ecc-webhook-cert\") pod 
\"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.289584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cbdm\" (UniqueName: \"kubernetes.io/projected/d4473a40-978f-429a-81cb-34ca70c51ecc-kube-api-access-6cbdm\") pod \"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.390474 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4473a40-978f-429a-81cb-34ca70c51ecc-apiservice-cert\") pod \"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.390518 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4473a40-978f-429a-81cb-34ca70c51ecc-webhook-cert\") pod \"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.390566 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cbdm\" (UniqueName: \"kubernetes.io/projected/d4473a40-978f-429a-81cb-34ca70c51ecc-kube-api-access-6cbdm\") pod \"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 
12:03:22.403095 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4473a40-978f-429a-81cb-34ca70c51ecc-apiservice-cert\") pod \"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.403339 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4473a40-978f-429a-81cb-34ca70c51ecc-webhook-cert\") pod \"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.417172 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cbdm\" (UniqueName: \"kubernetes.io/projected/d4473a40-978f-429a-81cb-34ca70c51ecc-kube-api-access-6cbdm\") pod \"metallb-operator-webhook-server-6958cc8947-vbltt\" (UID: \"d4473a40-978f-429a-81cb-34ca70c51ecc\") " pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.424991 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz"] Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.523834 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.737100 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt"] Mar 13 12:03:22 crc kubenswrapper[4786]: W0313 12:03:22.743301 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4473a40_978f_429a_81cb_34ca70c51ecc.slice/crio-064844361f922570b9a02afda1ee9db448aa7d931e845e963fac41d2449ffe4c WatchSource:0}: Error finding container 064844361f922570b9a02afda1ee9db448aa7d931e845e963fac41d2449ffe4c: Status 404 returned error can't find the container with id 064844361f922570b9a02afda1ee9db448aa7d931e845e963fac41d2449ffe4c Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.835261 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" event={"ID":"d4473a40-978f-429a-81cb-34ca70c51ecc","Type":"ContainerStarted","Data":"064844361f922570b9a02afda1ee9db448aa7d931e845e963fac41d2449ffe4c"} Mar 13 12:03:22 crc kubenswrapper[4786]: I0313 12:03:22.836753 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" event={"ID":"d7f40266-579b-4e46-8c76-6085fc8b2824","Type":"ContainerStarted","Data":"0e895b6e804cb6d4ebfc2fec8f0d7f61f46d552c47b9f2bf322a74e4b20bcd1b"} Mar 13 12:03:27 crc kubenswrapper[4786]: I0313 12:03:27.871242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" event={"ID":"d4473a40-978f-429a-81cb-34ca70c51ecc","Type":"ContainerStarted","Data":"eb17a98159b498fd27dcac2c386334b9e29b74328f583b361d157ecec04b4286"} Mar 13 12:03:27 crc kubenswrapper[4786]: I0313 12:03:27.872187 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:03:27 crc kubenswrapper[4786]: I0313 12:03:27.873289 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" event={"ID":"d7f40266-579b-4e46-8c76-6085fc8b2824","Type":"ContainerStarted","Data":"60c70715d2aa2ab9d725e10efae4b6ba3e34a410be66363531bcf115e33c91a7"} Mar 13 12:03:27 crc kubenswrapper[4786]: I0313 12:03:27.873450 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:03:27 crc kubenswrapper[4786]: I0313 12:03:27.892272 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" podStartSLOduration=1.458153388 podStartE2EDuration="5.892259766s" podCreationTimestamp="2026-03-13 12:03:22 +0000 UTC" firstStartedPulling="2026-03-13 12:03:22.746704724 +0000 UTC m=+990.026358171" lastFinishedPulling="2026-03-13 12:03:27.180811102 +0000 UTC m=+994.460464549" observedRunningTime="2026-03-13 12:03:27.891156587 +0000 UTC m=+995.170810044" watchObservedRunningTime="2026-03-13 12:03:27.892259766 +0000 UTC m=+995.171913213" Mar 13 12:03:27 crc kubenswrapper[4786]: I0313 12:03:27.919702 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" podStartSLOduration=2.20067877 podStartE2EDuration="6.919672251s" podCreationTimestamp="2026-03-13 12:03:21 +0000 UTC" firstStartedPulling="2026-03-13 12:03:22.436347848 +0000 UTC m=+989.716001295" lastFinishedPulling="2026-03-13 12:03:27.155341319 +0000 UTC m=+994.434994776" observedRunningTime="2026-03-13 12:03:27.911222784 +0000 UTC m=+995.190876271" watchObservedRunningTime="2026-03-13 12:03:27.919672251 +0000 UTC m=+995.199325708" Mar 13 12:03:42 crc kubenswrapper[4786]: I0313 12:03:42.530252 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6958cc8947-vbltt" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.145635 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556724-lq5fw"] Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.147346 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.150205 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.150818 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.153854 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-lq5fw"] Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.155358 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.250558 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tsvt\" (UniqueName: \"kubernetes.io/projected/b6a1d014-b7f1-4009-942d-3a4794a8f675-kube-api-access-8tsvt\") pod \"auto-csr-approver-29556724-lq5fw\" (UID: \"b6a1d014-b7f1-4009-942d-3a4794a8f675\") " pod="openshift-infra/auto-csr-approver-29556724-lq5fw" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.351786 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tsvt\" (UniqueName: \"kubernetes.io/projected/b6a1d014-b7f1-4009-942d-3a4794a8f675-kube-api-access-8tsvt\") pod \"auto-csr-approver-29556724-lq5fw\" (UID: 
\"b6a1d014-b7f1-4009-942d-3a4794a8f675\") " pod="openshift-infra/auto-csr-approver-29556724-lq5fw" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.379825 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tsvt\" (UniqueName: \"kubernetes.io/projected/b6a1d014-b7f1-4009-942d-3a4794a8f675-kube-api-access-8tsvt\") pod \"auto-csr-approver-29556724-lq5fw\" (UID: \"b6a1d014-b7f1-4009-942d-3a4794a8f675\") " pod="openshift-infra/auto-csr-approver-29556724-lq5fw" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.481433 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" Mar 13 12:04:00 crc kubenswrapper[4786]: I0313 12:04:00.918267 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-lq5fw"] Mar 13 12:04:01 crc kubenswrapper[4786]: I0313 12:04:01.109573 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" event={"ID":"b6a1d014-b7f1-4009-942d-3a4794a8f675","Type":"ContainerStarted","Data":"4f76e3e6d7667c87fc0123d419d63bf79f2b7cf37504b72ab34bf66c08b4e6f2"} Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.117019 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" event={"ID":"b6a1d014-b7f1-4009-942d-3a4794a8f675","Type":"ContainerStarted","Data":"8e2665de53fe5c4b48e3c6283ea00b58ec23a8c2f87e821662e474a78b5088b8"} Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.136767 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" podStartSLOduration=1.312565783 podStartE2EDuration="2.136743199s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:00.92803779 +0000 UTC m=+1028.207691247" lastFinishedPulling="2026-03-13 12:04:01.752215216 +0000 UTC m=+1029.031868663" 
observedRunningTime="2026-03-13 12:04:02.132839464 +0000 UTC m=+1029.412492941" watchObservedRunningTime="2026-03-13 12:04:02.136743199 +0000 UTC m=+1029.416396686" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.172214 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5f7647b4b8-hmxxz" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.860938 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh"] Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.861742 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.866317 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.866632 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kj79g" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.875995 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-mz2wz"] Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.878744 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.881248 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.881475 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.887038 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh"] Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.958286 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gj86q"] Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.959133 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gj86q" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.960674 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.961127 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.961233 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.961545 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-n8dh5" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.973936 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-f4p6b"] Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.974723 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:02 crc kubenswrapper[4786]: W0313 12:04:02.976145 4786 reflector.go:561] object-"metallb-system"/"controller-certs-secret": failed to list *v1.Secret: secrets "controller-certs-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 13 12:04:02 crc kubenswrapper[4786]: E0313 12:04:02.976184 4786 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-certs-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-certs-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988020 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7fqf\" (UniqueName: \"kubernetes.io/projected/f0ef5780-741a-4adc-a453-fcf5f4a8813e-kube-api-access-h7fqf\") pod \"frr-k8s-webhook-server-bcc4b6f68-pxszh\" (UID: \"f0ef5780-741a-4adc-a453-fcf5f4a8813e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988069 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-frr-sockets\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-reloader\") 
pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988111 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62899598-48ee-4c70-8641-5f7defde9e8f-metrics-certs\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988198 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-metrics\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqrg\" (UniqueName: \"kubernetes.io/projected/62899598-48ee-4c70-8641-5f7defde9e8f-kube-api-access-jdqrg\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988313 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-frr-conf\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/62899598-48ee-4c70-8641-5f7defde9e8f-frr-startup\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" 
Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.988434 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ef5780-741a-4adc-a453-fcf5f4a8813e-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pxszh\" (UID: \"f0ef5780-741a-4adc-a453-fcf5f4a8813e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:02 crc kubenswrapper[4786]: I0313 12:04:02.990423 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-f4p6b"] Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089428 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9500808-ad3b-464f-ac49-ddb93b08f58e-metallb-excludel2\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089473 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ef5780-741a-4adc-a453-fcf5f4a8813e-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pxszh\" (UID: \"f0ef5780-741a-4adc-a453-fcf5f4a8813e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089493 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-metrics-certs\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089513 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpkph\" (UniqueName: 
\"kubernetes.io/projected/c9500808-ad3b-464f-ac49-ddb93b08f58e-kube-api-access-kpkph\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089541 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7fqf\" (UniqueName: \"kubernetes.io/projected/f0ef5780-741a-4adc-a453-fcf5f4a8813e-kube-api-access-h7fqf\") pod \"frr-k8s-webhook-server-bcc4b6f68-pxszh\" (UID: \"f0ef5780-741a-4adc-a453-fcf5f4a8813e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089560 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-frr-sockets\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089575 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-reloader\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089590 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62899598-48ee-4c70-8641-5f7defde9e8f-metrics-certs\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089612 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-metrics-certs\") pod \"speaker-gj86q\" (UID: 
\"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089627 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-metrics\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089642 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089669 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdqrg\" (UniqueName: \"kubernetes.io/projected/62899598-48ee-4c70-8641-5f7defde9e8f-kube-api-access-jdqrg\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089694 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-frr-conf\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089714 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-cert\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089732 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/62899598-48ee-4c70-8641-5f7defde9e8f-frr-startup\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.089751 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqrvn\" (UniqueName: \"kubernetes.io/projected/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-kube-api-access-zqrvn\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.091128 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-frr-conf\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.091160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-frr-sockets\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.091279 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-reloader\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.091446 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/62899598-48ee-4c70-8641-5f7defde9e8f-frr-startup\") pod 
\"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.091499 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/62899598-48ee-4c70-8641-5f7defde9e8f-metrics\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.098505 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62899598-48ee-4c70-8641-5f7defde9e8f-metrics-certs\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.098678 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ef5780-741a-4adc-a453-fcf5f4a8813e-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pxszh\" (UID: \"f0ef5780-741a-4adc-a453-fcf5f4a8813e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.105276 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqrg\" (UniqueName: \"kubernetes.io/projected/62899598-48ee-4c70-8641-5f7defde9e8f-kube-api-access-jdqrg\") pod \"frr-k8s-mz2wz\" (UID: \"62899598-48ee-4c70-8641-5f7defde9e8f\") " pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.106810 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7fqf\" (UniqueName: \"kubernetes.io/projected/f0ef5780-741a-4adc-a453-fcf5f4a8813e-kube-api-access-h7fqf\") pod \"frr-k8s-webhook-server-bcc4b6f68-pxszh\" (UID: \"f0ef5780-741a-4adc-a453-fcf5f4a8813e\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:03 crc 
kubenswrapper[4786]: I0313 12:04:03.123446 4786 generic.go:334] "Generic (PLEG): container finished" podID="b6a1d014-b7f1-4009-942d-3a4794a8f675" containerID="8e2665de53fe5c4b48e3c6283ea00b58ec23a8c2f87e821662e474a78b5088b8" exitCode=0 Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.123495 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" event={"ID":"b6a1d014-b7f1-4009-942d-3a4794a8f675","Type":"ContainerDied","Data":"8e2665de53fe5c4b48e3c6283ea00b58ec23a8c2f87e821662e474a78b5088b8"} Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.177325 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.190596 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-metrics-certs\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.190647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.190701 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-cert\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.190763 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zqrvn\" (UniqueName: \"kubernetes.io/projected/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-kube-api-access-zqrvn\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:03 crc kubenswrapper[4786]: E0313 12:04:03.190871 4786 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 12:04:03 crc kubenswrapper[4786]: E0313 12:04:03.190988 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist podName:c9500808-ad3b-464f-ac49-ddb93b08f58e nodeName:}" failed. No retries permitted until 2026-03-13 12:04:03.690964288 +0000 UTC m=+1030.970617725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist") pod "speaker-gj86q" (UID: "c9500808-ad3b-464f-ac49-ddb93b08f58e") : secret "metallb-memberlist" not found Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.191112 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9500808-ad3b-464f-ac49-ddb93b08f58e-metallb-excludel2\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.191146 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-metrics-certs\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.191171 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpkph\" (UniqueName: 
\"kubernetes.io/projected/c9500808-ad3b-464f-ac49-ddb93b08f58e-kube-api-access-kpkph\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.191929 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9500808-ad3b-464f-ac49-ddb93b08f58e-metallb-excludel2\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.193155 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.193288 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.194831 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-metrics-certs\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.206303 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-cert\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.211434 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpkph\" (UniqueName: \"kubernetes.io/projected/c9500808-ad3b-464f-ac49-ddb93b08f58e-kube-api-access-kpkph\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 
crc kubenswrapper[4786]: I0313 12:04:03.211539 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqrvn\" (UniqueName: \"kubernetes.io/projected/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-kube-api-access-zqrvn\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.385006 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh"] Mar 13 12:04:03 crc kubenswrapper[4786]: W0313 12:04:03.391115 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ef5780_741a_4adc_a453_fcf5f4a8813e.slice/crio-aaa0a795a29df51790083aa3044a665a4399437b17decc263e7ca37db0d681fc WatchSource:0}: Error finding container aaa0a795a29df51790083aa3044a665a4399437b17decc263e7ca37db0d681fc: Status 404 returned error can't find the container with id aaa0a795a29df51790083aa3044a665a4399437b17decc263e7ca37db0d681fc Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.699180 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:03 crc kubenswrapper[4786]: E0313 12:04:03.699357 4786 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 12:04:03 crc kubenswrapper[4786]: E0313 12:04:03.699641 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist podName:c9500808-ad3b-464f-ac49-ddb93b08f58e nodeName:}" failed. 
No retries permitted until 2026-03-13 12:04:04.699622408 +0000 UTC m=+1031.979275855 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist") pod "speaker-gj86q" (UID: "c9500808-ad3b-464f-ac49-ddb93b08f58e") : secret "metallb-memberlist" not found Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.940783 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 12:04:03 crc kubenswrapper[4786]: I0313 12:04:03.950628 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2-metrics-certs\") pod \"controller-7bb4cc7c98-f4p6b\" (UID: \"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2\") " pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.132304 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerStarted","Data":"f318e454349f5b29eb3516426121057f3614d45b18280218b825db45b74fe2a9"} Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.133506 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" event={"ID":"f0ef5780-741a-4adc-a453-fcf5f4a8813e","Type":"ContainerStarted","Data":"aaa0a795a29df51790083aa3044a665a4399437b17decc263e7ca37db0d681fc"} Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.190089 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.385343 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.510415 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tsvt\" (UniqueName: \"kubernetes.io/projected/b6a1d014-b7f1-4009-942d-3a4794a8f675-kube-api-access-8tsvt\") pod \"b6a1d014-b7f1-4009-942d-3a4794a8f675\" (UID: \"b6a1d014-b7f1-4009-942d-3a4794a8f675\") " Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.527738 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a1d014-b7f1-4009-942d-3a4794a8f675-kube-api-access-8tsvt" (OuterVolumeSpecName: "kube-api-access-8tsvt") pod "b6a1d014-b7f1-4009-942d-3a4794a8f675" (UID: "b6a1d014-b7f1-4009-942d-3a4794a8f675"). InnerVolumeSpecName "kube-api-access-8tsvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.612794 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tsvt\" (UniqueName: \"kubernetes.io/projected/b6a1d014-b7f1-4009-942d-3a4794a8f675-kube-api-access-8tsvt\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.709913 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-f4p6b"] Mar 13 12:04:04 crc kubenswrapper[4786]: I0313 12:04:04.715947 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:04 crc kubenswrapper[4786]: E0313 12:04:04.716237 4786 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 12:04:04 crc kubenswrapper[4786]: E0313 12:04:04.716328 4786 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist podName:c9500808-ad3b-464f-ac49-ddb93b08f58e nodeName:}" failed. No retries permitted until 2026-03-13 12:04:06.716306462 +0000 UTC m=+1033.995959909 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist") pod "speaker-gj86q" (UID: "c9500808-ad3b-464f-ac49-ddb93b08f58e") : secret "metallb-memberlist" not found Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.143341 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" event={"ID":"b6a1d014-b7f1-4009-942d-3a4794a8f675","Type":"ContainerDied","Data":"4f76e3e6d7667c87fc0123d419d63bf79f2b7cf37504b72ab34bf66c08b4e6f2"} Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.143378 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f76e3e6d7667c87fc0123d419d63bf79f2b7cf37504b72ab34bf66c08b4e6f2" Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.143423 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-lq5fw" Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.153169 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-f4p6b" event={"ID":"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2","Type":"ContainerStarted","Data":"4d418ac3566cb81b46febfcac90b2c3302796567828ce008881da8d246a1f17d"} Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.153298 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-f4p6b" event={"ID":"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2","Type":"ContainerStarted","Data":"8195176cc14091d4348369f982ac02c470ed4a9904f5022e56d5e17d86ab98c0"} Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.153365 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-f4p6b" event={"ID":"2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2","Type":"ContainerStarted","Data":"297ea573ef2d1e891e689763717facb6e6d16e173fb6d188fd4dc15c650f8ef1"} Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.153430 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.178933 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-f4p6b" podStartSLOduration=3.178911878 podStartE2EDuration="3.178911878s" podCreationTimestamp="2026-03-13 12:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:04:05.174261374 +0000 UTC m=+1032.453914821" watchObservedRunningTime="2026-03-13 12:04:05.178911878 +0000 UTC m=+1032.458565335" Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.186018 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-jvntt"] Mar 13 12:04:05 crc 
kubenswrapper[4786]: I0313 12:04:05.190538 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-jvntt"] Mar 13 12:04:05 crc kubenswrapper[4786]: I0313 12:04:05.471419 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7390aaf-5c5b-4eb5-8a09-e40a5591da5b" path="/var/lib/kubelet/pods/e7390aaf-5c5b-4eb5-8a09-e40a5591da5b/volumes" Mar 13 12:04:06 crc kubenswrapper[4786]: I0313 12:04:06.746437 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:06 crc kubenswrapper[4786]: I0313 12:04:06.755213 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9500808-ad3b-464f-ac49-ddb93b08f58e-memberlist\") pod \"speaker-gj86q\" (UID: \"c9500808-ad3b-464f-ac49-ddb93b08f58e\") " pod="metallb-system/speaker-gj86q" Mar 13 12:04:06 crc kubenswrapper[4786]: I0313 12:04:06.871785 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-gj86q" Mar 13 12:04:06 crc kubenswrapper[4786]: W0313 12:04:06.905032 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9500808_ad3b_464f_ac49_ddb93b08f58e.slice/crio-04709365b078db30ca58e6124f3c684e3ab937514898bbba313fd30c2fe19cc6 WatchSource:0}: Error finding container 04709365b078db30ca58e6124f3c684e3ab937514898bbba313fd30c2fe19cc6: Status 404 returned error can't find the container with id 04709365b078db30ca58e6124f3c684e3ab937514898bbba313fd30c2fe19cc6 Mar 13 12:04:07 crc kubenswrapper[4786]: I0313 12:04:07.193139 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gj86q" event={"ID":"c9500808-ad3b-464f-ac49-ddb93b08f58e","Type":"ContainerStarted","Data":"407aa2ca9cbe12ed07dc231075a12c4ec89b6cc5a6d8f8f6aaaf10da7077eb83"} Mar 13 12:04:07 crc kubenswrapper[4786]: I0313 12:04:07.193193 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gj86q" event={"ID":"c9500808-ad3b-464f-ac49-ddb93b08f58e","Type":"ContainerStarted","Data":"04709365b078db30ca58e6124f3c684e3ab937514898bbba313fd30c2fe19cc6"} Mar 13 12:04:08 crc kubenswrapper[4786]: I0313 12:04:08.169214 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:04:08 crc kubenswrapper[4786]: I0313 12:04:08.180075 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:04:08 crc kubenswrapper[4786]: 
I0313 12:04:08.201750 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gj86q" event={"ID":"c9500808-ad3b-464f-ac49-ddb93b08f58e","Type":"ContainerStarted","Data":"ea2a5bbad28ed8f996821747b5cb117438e9c335d833c899515478708edf9e11"} Mar 13 12:04:08 crc kubenswrapper[4786]: I0313 12:04:08.202585 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gj86q" Mar 13 12:04:10 crc kubenswrapper[4786]: I0313 12:04:10.275225 4786 scope.go:117] "RemoveContainer" containerID="fb1823e793d40ba291d6da1843368104644d032300234a9ee48d8c92d161d6c5" Mar 13 12:04:11 crc kubenswrapper[4786]: I0313 12:04:11.234909 4786 generic.go:334] "Generic (PLEG): container finished" podID="62899598-48ee-4c70-8641-5f7defde9e8f" containerID="bf834c9257bf10e90c0612411a61ae20fa77876183fbf767187c4440dcb2a8dc" exitCode=0 Mar 13 12:04:11 crc kubenswrapper[4786]: I0313 12:04:11.234995 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerDied","Data":"bf834c9257bf10e90c0612411a61ae20fa77876183fbf767187c4440dcb2a8dc"} Mar 13 12:04:11 crc kubenswrapper[4786]: I0313 12:04:11.237305 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" event={"ID":"f0ef5780-741a-4adc-a453-fcf5f4a8813e","Type":"ContainerStarted","Data":"4d4addcc6c3aa5843529fef23af628aabf43f99e20ab9b99fbab1974aee03cd6"} Mar 13 12:04:11 crc kubenswrapper[4786]: I0313 12:04:11.237559 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:11 crc kubenswrapper[4786]: I0313 12:04:11.260776 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gj86q" podStartSLOduration=9.260753078 podStartE2EDuration="9.260753078s" podCreationTimestamp="2026-03-13 12:04:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:04:08.221703142 +0000 UTC m=+1035.501356599" watchObservedRunningTime="2026-03-13 12:04:11.260753078 +0000 UTC m=+1038.540406545" Mar 13 12:04:11 crc kubenswrapper[4786]: I0313 12:04:11.281439 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" podStartSLOduration=1.897754046 podStartE2EDuration="9.281417941s" podCreationTimestamp="2026-03-13 12:04:02 +0000 UTC" firstStartedPulling="2026-03-13 12:04:03.392905699 +0000 UTC m=+1030.672559136" lastFinishedPulling="2026-03-13 12:04:10.776569584 +0000 UTC m=+1038.056223031" observedRunningTime="2026-03-13 12:04:11.279293135 +0000 UTC m=+1038.558946642" watchObservedRunningTime="2026-03-13 12:04:11.281417941 +0000 UTC m=+1038.561071388" Mar 13 12:04:12 crc kubenswrapper[4786]: I0313 12:04:12.244103 4786 generic.go:334] "Generic (PLEG): container finished" podID="62899598-48ee-4c70-8641-5f7defde9e8f" containerID="cf227ad0d0d4e0cfda0d6beecd483fec4cf387867d739b4aa17f1c0796c8b826" exitCode=0 Mar 13 12:04:12 crc kubenswrapper[4786]: I0313 12:04:12.244232 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerDied","Data":"cf227ad0d0d4e0cfda0d6beecd483fec4cf387867d739b4aa17f1c0796c8b826"} Mar 13 12:04:13 crc kubenswrapper[4786]: I0313 12:04:13.259959 4786 generic.go:334] "Generic (PLEG): container finished" podID="62899598-48ee-4c70-8641-5f7defde9e8f" containerID="ab9c9b7969f53783d56bbe015dc192fd69e1556343ac0ee9d63c9925c20c4d07" exitCode=0 Mar 13 12:04:13 crc kubenswrapper[4786]: I0313 12:04:13.260018 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" 
event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerDied","Data":"ab9c9b7969f53783d56bbe015dc192fd69e1556343ac0ee9d63c9925c20c4d07"} Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.194504 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-f4p6b" Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.274528 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerStarted","Data":"da9e371d496a5ba59a6902f23ab0ef6dfed4b99b6d424cabd94b4937e0d20acc"} Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.275360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerStarted","Data":"6cac52f319a8a9515c96ed206fe60e3d10407be048ee4dce5ee5785a762e081c"} Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.275378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerStarted","Data":"327337f2ebc170f5b7f285ad44cd0616a0e5dee449c5a4f335b407c3f910515e"} Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.275390 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.275399 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerStarted","Data":"bec4e72746053b02b0484fb224235686427bb99e43bf250a0640b8a76f6fd6c8"} Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.275407 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" 
event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerStarted","Data":"9c7122d4868cecd5a4bd9d26f8d8a4514c54da8b8ab4b449fc141dd757c2cdf2"} Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.275415 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mz2wz" event={"ID":"62899598-48ee-4c70-8641-5f7defde9e8f","Type":"ContainerStarted","Data":"509d75156d8ad5485d3f6da79f1f8c3bd576f33e7b78a425845f796b4db68fdc"} Mar 13 12:04:14 crc kubenswrapper[4786]: I0313 12:04:14.297080 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mz2wz" podStartSLOduration=4.876738423 podStartE2EDuration="12.297057809s" podCreationTimestamp="2026-03-13 12:04:02 +0000 UTC" firstStartedPulling="2026-03-13 12:04:03.341262086 +0000 UTC m=+1030.620915533" lastFinishedPulling="2026-03-13 12:04:10.761581452 +0000 UTC m=+1038.041234919" observedRunningTime="2026-03-13 12:04:14.293785032 +0000 UTC m=+1041.573438489" watchObservedRunningTime="2026-03-13 12:04:14.297057809 +0000 UTC m=+1041.576711266" Mar 13 12:04:18 crc kubenswrapper[4786]: I0313 12:04:18.193991 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:18 crc kubenswrapper[4786]: I0313 12:04:18.268173 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:23 crc kubenswrapper[4786]: I0313 12:04:23.186155 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pxszh" Mar 13 12:04:23 crc kubenswrapper[4786]: I0313 12:04:23.200051 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mz2wz" Mar 13 12:04:26 crc kubenswrapper[4786]: I0313 12:04:26.878217 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gj86q" Mar 13 12:04:28 crc 
kubenswrapper[4786]: I0313 12:04:28.225452 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd"] Mar 13 12:04:28 crc kubenswrapper[4786]: E0313 12:04:28.225776 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a1d014-b7f1-4009-942d-3a4794a8f675" containerName="oc" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.225795 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a1d014-b7f1-4009-942d-3a4794a8f675" containerName="oc" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.226013 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a1d014-b7f1-4009-942d-3a4794a8f675" containerName="oc" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.227274 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.230823 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.236699 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd"] Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.290393 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnr5j\" (UniqueName: \"kubernetes.io/projected/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-kube-api-access-tnr5j\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.290458 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.290491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.392315 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnr5j\" (UniqueName: \"kubernetes.io/projected/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-kube-api-access-tnr5j\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.392384 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.392417 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.393129 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.393230 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.414014 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnr5j\" (UniqueName: \"kubernetes.io/projected/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-kube-api-access-tnr5j\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.556368 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:28 crc kubenswrapper[4786]: I0313 12:04:28.982949 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd"] Mar 13 12:04:28 crc kubenswrapper[4786]: W0313 12:04:28.990520 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23df2d8e_3fd0_4358_a8d0_4e9f65c28abd.slice/crio-2eb5868c0c8350706ca8f1a46aa667627cd99b847d7a7ff2db65adc61284631c WatchSource:0}: Error finding container 2eb5868c0c8350706ca8f1a46aa667627cd99b847d7a7ff2db65adc61284631c: Status 404 returned error can't find the container with id 2eb5868c0c8350706ca8f1a46aa667627cd99b847d7a7ff2db65adc61284631c Mar 13 12:04:29 crc kubenswrapper[4786]: I0313 12:04:29.405993 4786 generic.go:334] "Generic (PLEG): container finished" podID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerID="809b38a7d1f6df9b4f047191aaa3fe85032cbe44f100752bb1a17a5742d4c5de" exitCode=0 Mar 13 12:04:29 crc kubenswrapper[4786]: I0313 12:04:29.406043 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" event={"ID":"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd","Type":"ContainerDied","Data":"809b38a7d1f6df9b4f047191aaa3fe85032cbe44f100752bb1a17a5742d4c5de"} Mar 13 12:04:29 crc kubenswrapper[4786]: I0313 12:04:29.406070 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" event={"ID":"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd","Type":"ContainerStarted","Data":"2eb5868c0c8350706ca8f1a46aa667627cd99b847d7a7ff2db65adc61284631c"} Mar 13 12:04:34 crc kubenswrapper[4786]: I0313 12:04:34.446070 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerID="ada9bef53f2f85f851eba6265f07862088bdc4ddac1b5dbd6dc6727c13400fdb" exitCode=0 Mar 13 12:04:34 crc kubenswrapper[4786]: I0313 12:04:34.446190 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" event={"ID":"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd","Type":"ContainerDied","Data":"ada9bef53f2f85f851eba6265f07862088bdc4ddac1b5dbd6dc6727c13400fdb"} Mar 13 12:04:35 crc kubenswrapper[4786]: I0313 12:04:35.455118 4786 generic.go:334] "Generic (PLEG): container finished" podID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerID="34fbb89e662bf4035e273b8ec1869ac7eeccd679a0c5f920655dd6afc8a7df58" exitCode=0 Mar 13 12:04:35 crc kubenswrapper[4786]: I0313 12:04:35.455169 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" event={"ID":"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd","Type":"ContainerDied","Data":"34fbb89e662bf4035e273b8ec1869ac7eeccd679a0c5f920655dd6afc8a7df58"} Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.776611 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.831470 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnr5j\" (UniqueName: \"kubernetes.io/projected/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-kube-api-access-tnr5j\") pod \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.831526 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-util\") pod \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.831688 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-bundle\") pod \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\" (UID: \"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd\") " Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.833162 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-bundle" (OuterVolumeSpecName: "bundle") pod "23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" (UID: "23df2d8e-3fd0-4358-a8d0-4e9f65c28abd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.837844 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-kube-api-access-tnr5j" (OuterVolumeSpecName: "kube-api-access-tnr5j") pod "23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" (UID: "23df2d8e-3fd0-4358-a8d0-4e9f65c28abd"). InnerVolumeSpecName "kube-api-access-tnr5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.850860 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-util" (OuterVolumeSpecName: "util") pod "23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" (UID: "23df2d8e-3fd0-4358-a8d0-4e9f65c28abd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.934102 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnr5j\" (UniqueName: \"kubernetes.io/projected/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-kube-api-access-tnr5j\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.934178 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-util\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:36 crc kubenswrapper[4786]: I0313 12:04:36.934216 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23df2d8e-3fd0-4358-a8d0-4e9f65c28abd-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:37 crc kubenswrapper[4786]: I0313 12:04:37.475929 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" event={"ID":"23df2d8e-3fd0-4358-a8d0-4e9f65c28abd","Type":"ContainerDied","Data":"2eb5868c0c8350706ca8f1a46aa667627cd99b847d7a7ff2db65adc61284631c"} Mar 13 12:04:37 crc kubenswrapper[4786]: I0313 12:04:37.476001 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd" Mar 13 12:04:37 crc kubenswrapper[4786]: I0313 12:04:37.476005 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb5868c0c8350706ca8f1a46aa667627cd99b847d7a7ff2db65adc61284631c" Mar 13 12:04:38 crc kubenswrapper[4786]: I0313 12:04:38.169744 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:04:38 crc kubenswrapper[4786]: I0313 12:04:38.169922 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.089119 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh"] Mar 13 12:04:41 crc kubenswrapper[4786]: E0313 12:04:41.089928 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerName="extract" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.089944 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerName="extract" Mar 13 12:04:41 crc kubenswrapper[4786]: E0313 12:04:41.089971 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerName="util" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.089979 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerName="util" Mar 13 12:04:41 crc kubenswrapper[4786]: E0313 12:04:41.089992 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerName="pull" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.090000 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerName="pull" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.090130 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="23df2d8e-3fd0-4358-a8d0-4e9f65c28abd" containerName="extract" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.090602 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.092992 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.093375 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-4ftsr" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.096289 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.105984 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh"] Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.193087 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmffl\" (UniqueName: \"kubernetes.io/projected/c9420f84-e705-40dc-8bd9-d62239d4fb42-kube-api-access-rmffl\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g9zkh\" 
(UID: \"c9420f84-e705-40dc-8bd9-d62239d4fb42\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.193176 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9420f84-e705-40dc-8bd9-d62239d4fb42-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g9zkh\" (UID: \"c9420f84-e705-40dc-8bd9-d62239d4fb42\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.294091 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmffl\" (UniqueName: \"kubernetes.io/projected/c9420f84-e705-40dc-8bd9-d62239d4fb42-kube-api-access-rmffl\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g9zkh\" (UID: \"c9420f84-e705-40dc-8bd9-d62239d4fb42\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.294175 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9420f84-e705-40dc-8bd9-d62239d4fb42-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g9zkh\" (UID: \"c9420f84-e705-40dc-8bd9-d62239d4fb42\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.294853 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9420f84-e705-40dc-8bd9-d62239d4fb42-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g9zkh\" (UID: \"c9420f84-e705-40dc-8bd9-d62239d4fb42\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.311698 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmffl\" (UniqueName: \"kubernetes.io/projected/c9420f84-e705-40dc-8bd9-d62239d4fb42-kube-api-access-rmffl\") pod \"cert-manager-operator-controller-manager-66c8bdd694-g9zkh\" (UID: \"c9420f84-e705-40dc-8bd9-d62239d4fb42\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.430927 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" Mar 13 12:04:41 crc kubenswrapper[4786]: I0313 12:04:41.851960 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh"] Mar 13 12:04:42 crc kubenswrapper[4786]: I0313 12:04:42.506966 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" event={"ID":"c9420f84-e705-40dc-8bd9-d62239d4fb42","Type":"ContainerStarted","Data":"3be67441ed0b6227cbcd69076213f185c2154ed758ef907a434e3a97b343b2c1"} Mar 13 12:04:46 crc kubenswrapper[4786]: I0313 12:04:46.535401 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" event={"ID":"c9420f84-e705-40dc-8bd9-d62239d4fb42","Type":"ContainerStarted","Data":"6c6cbb80165536474fac3c12bb895038bf2aad22bb1e3f5e0ae8b3182af9ab7b"} Mar 13 12:04:46 crc kubenswrapper[4786]: I0313 12:04:46.557699 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-g9zkh" podStartSLOduration=1.500973849 podStartE2EDuration="5.557685093s" podCreationTimestamp="2026-03-13 12:04:41 +0000 UTC" firstStartedPulling="2026-03-13 12:04:41.858713028 +0000 UTC m=+1069.138366475" lastFinishedPulling="2026-03-13 
12:04:45.915424272 +0000 UTC m=+1073.195077719" observedRunningTime="2026-03-13 12:04:46.555938476 +0000 UTC m=+1073.835591943" watchObservedRunningTime="2026-03-13 12:04:46.557685093 +0000 UTC m=+1073.837338530" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.211820 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-lw7q8"] Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.212717 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.215036 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-s9pwn" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.215410 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.215623 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.224862 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-lw7q8"] Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.296070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20a46b3c-810e-4c07-974e-42bf0a40efc1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-lw7q8\" (UID: \"20a46b3c-810e-4c07-974e-42bf0a40efc1\") " pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.296528 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw5s7\" (UniqueName: 
\"kubernetes.io/projected/20a46b3c-810e-4c07-974e-42bf0a40efc1-kube-api-access-zw5s7\") pod \"cert-manager-webhook-6888856db4-lw7q8\" (UID: \"20a46b3c-810e-4c07-974e-42bf0a40efc1\") " pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.398527 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw5s7\" (UniqueName: \"kubernetes.io/projected/20a46b3c-810e-4c07-974e-42bf0a40efc1-kube-api-access-zw5s7\") pod \"cert-manager-webhook-6888856db4-lw7q8\" (UID: \"20a46b3c-810e-4c07-974e-42bf0a40efc1\") " pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.398610 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20a46b3c-810e-4c07-974e-42bf0a40efc1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-lw7q8\" (UID: \"20a46b3c-810e-4c07-974e-42bf0a40efc1\") " pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.440075 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw5s7\" (UniqueName: \"kubernetes.io/projected/20a46b3c-810e-4c07-974e-42bf0a40efc1-kube-api-access-zw5s7\") pod \"cert-manager-webhook-6888856db4-lw7q8\" (UID: \"20a46b3c-810e-4c07-974e-42bf0a40efc1\") " pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.440750 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20a46b3c-810e-4c07-974e-42bf0a40efc1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-lw7q8\" (UID: \"20a46b3c-810e-4c07-974e-42bf0a40efc1\") " pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.526795 4786 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:49 crc kubenswrapper[4786]: I0313 12:04:49.973764 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-lw7q8"] Mar 13 12:04:50 crc kubenswrapper[4786]: I0313 12:04:50.564534 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" event={"ID":"20a46b3c-810e-4c07-974e-42bf0a40efc1","Type":"ContainerStarted","Data":"47ef5ccb9bd927cb99a59409ca0ad9c84380119e1831f60532b45c7f76cc79ab"} Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.744929 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-2xpth"] Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.746086 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.747936 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-rjvvs" Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.751540 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-2xpth"] Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.840850 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ea8859a-f224-44ba-b451-fdf4f6401cfc-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-2xpth\" (UID: \"5ea8859a-f224-44ba-b451-fdf4f6401cfc\") " pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.840919 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4z5g\" (UniqueName: 
\"kubernetes.io/projected/5ea8859a-f224-44ba-b451-fdf4f6401cfc-kube-api-access-w4z5g\") pod \"cert-manager-cainjector-5545bd876-2xpth\" (UID: \"5ea8859a-f224-44ba-b451-fdf4f6401cfc\") " pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.942744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4z5g\" (UniqueName: \"kubernetes.io/projected/5ea8859a-f224-44ba-b451-fdf4f6401cfc-kube-api-access-w4z5g\") pod \"cert-manager-cainjector-5545bd876-2xpth\" (UID: \"5ea8859a-f224-44ba-b451-fdf4f6401cfc\") " pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.942906 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ea8859a-f224-44ba-b451-fdf4f6401cfc-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-2xpth\" (UID: \"5ea8859a-f224-44ba-b451-fdf4f6401cfc\") " pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.967577 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4z5g\" (UniqueName: \"kubernetes.io/projected/5ea8859a-f224-44ba-b451-fdf4f6401cfc-kube-api-access-w4z5g\") pod \"cert-manager-cainjector-5545bd876-2xpth\" (UID: \"5ea8859a-f224-44ba-b451-fdf4f6401cfc\") " pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" Mar 13 12:04:52 crc kubenswrapper[4786]: I0313 12:04:52.974207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ea8859a-f224-44ba-b451-fdf4f6401cfc-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-2xpth\" (UID: \"5ea8859a-f224-44ba-b451-fdf4f6401cfc\") " pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" Mar 13 12:04:53 crc kubenswrapper[4786]: I0313 12:04:53.064283 4786 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" Mar 13 12:04:54 crc kubenswrapper[4786]: I0313 12:04:54.156540 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-2xpth"] Mar 13 12:04:54 crc kubenswrapper[4786]: W0313 12:04:54.160708 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ea8859a_f224_44ba_b451_fdf4f6401cfc.slice/crio-7ea63f7a14894ed502bbcae5d768dc8f31f5c8660951e95d8b39c41f4b9d847c WatchSource:0}: Error finding container 7ea63f7a14894ed502bbcae5d768dc8f31f5c8660951e95d8b39c41f4b9d847c: Status 404 returned error can't find the container with id 7ea63f7a14894ed502bbcae5d768dc8f31f5c8660951e95d8b39c41f4b9d847c Mar 13 12:04:54 crc kubenswrapper[4786]: I0313 12:04:54.590786 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" event={"ID":"5ea8859a-f224-44ba-b451-fdf4f6401cfc","Type":"ContainerStarted","Data":"d2afb00a98fdf6de36a1171cb84c307ddbf7892d6c12460c37396d5a816fb302"} Mar 13 12:04:54 crc kubenswrapper[4786]: I0313 12:04:54.590945 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" event={"ID":"5ea8859a-f224-44ba-b451-fdf4f6401cfc","Type":"ContainerStarted","Data":"7ea63f7a14894ed502bbcae5d768dc8f31f5c8660951e95d8b39c41f4b9d847c"} Mar 13 12:04:54 crc kubenswrapper[4786]: I0313 12:04:54.592360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" event={"ID":"20a46b3c-810e-4c07-974e-42bf0a40efc1","Type":"ContainerStarted","Data":"e082359e591fd371899209d2ffabc3d3442dc4f4c2caeda08e8fa8ca7d9939a9"} Mar 13 12:04:54 crc kubenswrapper[4786]: I0313 12:04:54.592814 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:04:54 crc kubenswrapper[4786]: I0313 12:04:54.606969 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-2xpth" podStartSLOduration=2.606950773 podStartE2EDuration="2.606950773s" podCreationTimestamp="2026-03-13 12:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:04:54.603046598 +0000 UTC m=+1081.882700055" watchObservedRunningTime="2026-03-13 12:04:54.606950773 +0000 UTC m=+1081.886604220" Mar 13 12:04:54 crc kubenswrapper[4786]: I0313 12:04:54.630536 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" podStartSLOduration=1.581232429 podStartE2EDuration="5.630518255s" podCreationTimestamp="2026-03-13 12:04:49 +0000 UTC" firstStartedPulling="2026-03-13 12:04:49.974144971 +0000 UTC m=+1077.253798428" lastFinishedPulling="2026-03-13 12:04:54.023430807 +0000 UTC m=+1081.303084254" observedRunningTime="2026-03-13 12:04:54.628311055 +0000 UTC m=+1081.907964532" watchObservedRunningTime="2026-03-13 12:04:54.630518255 +0000 UTC m=+1081.910171702" Mar 13 12:04:59 crc kubenswrapper[4786]: I0313 12:04:59.531525 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-lw7q8" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.038138 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-dcm9k"] Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.039564 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-dcm9k" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.042861 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-g4bzt" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.053285 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-dcm9k"] Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.060018 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxsvr\" (UniqueName: \"kubernetes.io/projected/4ab306ea-196b-40b7-b016-1f29d639935b-kube-api-access-rxsvr\") pod \"cert-manager-545d4d4674-dcm9k\" (UID: \"4ab306ea-196b-40b7-b016-1f29d639935b\") " pod="cert-manager/cert-manager-545d4d4674-dcm9k" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.060191 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ab306ea-196b-40b7-b016-1f29d639935b-bound-sa-token\") pod \"cert-manager-545d4d4674-dcm9k\" (UID: \"4ab306ea-196b-40b7-b016-1f29d639935b\") " pod="cert-manager/cert-manager-545d4d4674-dcm9k" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.161467 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxsvr\" (UniqueName: \"kubernetes.io/projected/4ab306ea-196b-40b7-b016-1f29d639935b-kube-api-access-rxsvr\") pod \"cert-manager-545d4d4674-dcm9k\" (UID: \"4ab306ea-196b-40b7-b016-1f29d639935b\") " pod="cert-manager/cert-manager-545d4d4674-dcm9k" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.161541 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ab306ea-196b-40b7-b016-1f29d639935b-bound-sa-token\") pod \"cert-manager-545d4d4674-dcm9k\" (UID: 
\"4ab306ea-196b-40b7-b016-1f29d639935b\") " pod="cert-manager/cert-manager-545d4d4674-dcm9k" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.169835 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.169932 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.169986 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.170650 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cee9aff52905686331ac0d49b868be713596890b00b0633ce66e8cdee6b5f0de"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.170754 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://cee9aff52905686331ac0d49b868be713596890b00b0633ce66e8cdee6b5f0de" gracePeriod=600 Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.186960 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ab306ea-196b-40b7-b016-1f29d639935b-bound-sa-token\") pod \"cert-manager-545d4d4674-dcm9k\" (UID: \"4ab306ea-196b-40b7-b016-1f29d639935b\") " pod="cert-manager/cert-manager-545d4d4674-dcm9k" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.188919 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxsvr\" (UniqueName: \"kubernetes.io/projected/4ab306ea-196b-40b7-b016-1f29d639935b-kube-api-access-rxsvr\") pod \"cert-manager-545d4d4674-dcm9k\" (UID: \"4ab306ea-196b-40b7-b016-1f29d639935b\") " pod="cert-manager/cert-manager-545d4d4674-dcm9k" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.363230 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-dcm9k" Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.696785 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="cee9aff52905686331ac0d49b868be713596890b00b0633ce66e8cdee6b5f0de" exitCode=0 Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.696819 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"cee9aff52905686331ac0d49b868be713596890b00b0633ce66e8cdee6b5f0de"} Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.697223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"5656b6c6cc644913041fc5892205e2cc6f507fb238f0bcbc7956307710968e91"} Mar 13 12:05:08 crc kubenswrapper[4786]: I0313 12:05:08.697246 4786 scope.go:117] "RemoveContainer" containerID="53f9a9165f399ca75a5c5e665434b0714c4c497324b97b5da97227bbf25aa5b5" Mar 13 12:05:08 crc kubenswrapper[4786]: 
I0313 12:05:08.794813 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-dcm9k"] Mar 13 12:05:08 crc kubenswrapper[4786]: W0313 12:05:08.795694 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab306ea_196b_40b7_b016_1f29d639935b.slice/crio-f338a242a8116237f3d6a55d317c1f1dc30d4ec765af8ebf2439462a212cef2b WatchSource:0}: Error finding container f338a242a8116237f3d6a55d317c1f1dc30d4ec765af8ebf2439462a212cef2b: Status 404 returned error can't find the container with id f338a242a8116237f3d6a55d317c1f1dc30d4ec765af8ebf2439462a212cef2b Mar 13 12:05:09 crc kubenswrapper[4786]: I0313 12:05:09.711815 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-dcm9k" event={"ID":"4ab306ea-196b-40b7-b016-1f29d639935b","Type":"ContainerStarted","Data":"0ebbdb1d261f84b42166d716f4adcfb6667a43c7fef5c823b1776cd005bcebf5"} Mar 13 12:05:09 crc kubenswrapper[4786]: I0313 12:05:09.711872 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-dcm9k" event={"ID":"4ab306ea-196b-40b7-b016-1f29d639935b","Type":"ContainerStarted","Data":"f338a242a8116237f3d6a55d317c1f1dc30d4ec765af8ebf2439462a212cef2b"} Mar 13 12:05:09 crc kubenswrapper[4786]: I0313 12:05:09.748032 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-dcm9k" podStartSLOduration=1.747858802 podStartE2EDuration="1.747858802s" podCreationTimestamp="2026-03-13 12:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:09.743917147 +0000 UTC m=+1097.023570624" watchObservedRunningTime="2026-03-13 12:05:09.747858802 +0000 UTC m=+1097.027512289" Mar 13 12:05:12 crc kubenswrapper[4786]: I0313 12:05:12.803386 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-jhshw"] Mar 13 12:05:12 crc kubenswrapper[4786]: I0313 12:05:12.804857 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jhshw" Mar 13 12:05:12 crc kubenswrapper[4786]: I0313 12:05:12.808349 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-c5jh2" Mar 13 12:05:12 crc kubenswrapper[4786]: I0313 12:05:12.809452 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 12:05:12 crc kubenswrapper[4786]: I0313 12:05:12.810119 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 12:05:12 crc kubenswrapper[4786]: I0313 12:05:12.829653 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jhshw"] Mar 13 12:05:12 crc kubenswrapper[4786]: I0313 12:05:12.932724 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp2gw\" (UniqueName: \"kubernetes.io/projected/e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97-kube-api-access-dp2gw\") pod \"openstack-operator-index-jhshw\" (UID: \"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97\") " pod="openstack-operators/openstack-operator-index-jhshw" Mar 13 12:05:13 crc kubenswrapper[4786]: I0313 12:05:13.034187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp2gw\" (UniqueName: \"kubernetes.io/projected/e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97-kube-api-access-dp2gw\") pod \"openstack-operator-index-jhshw\" (UID: \"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97\") " pod="openstack-operators/openstack-operator-index-jhshw" Mar 13 12:05:13 crc kubenswrapper[4786]: I0313 12:05:13.051478 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp2gw\" 
(UniqueName: \"kubernetes.io/projected/e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97-kube-api-access-dp2gw\") pod \"openstack-operator-index-jhshw\" (UID: \"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97\") " pod="openstack-operators/openstack-operator-index-jhshw" Mar 13 12:05:13 crc kubenswrapper[4786]: I0313 12:05:13.137478 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jhshw" Mar 13 12:05:13 crc kubenswrapper[4786]: I0313 12:05:13.594600 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jhshw"] Mar 13 12:05:13 crc kubenswrapper[4786]: W0313 12:05:13.599900 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode89c6aa4_f637_4d5b_8cb3_a1c8a9d95d97.slice/crio-5892b464f473eac1a1a267fe89e3b62e0107f5b34bc57dd11c3f690866018c1e WatchSource:0}: Error finding container 5892b464f473eac1a1a267fe89e3b62e0107f5b34bc57dd11c3f690866018c1e: Status 404 returned error can't find the container with id 5892b464f473eac1a1a267fe89e3b62e0107f5b34bc57dd11c3f690866018c1e Mar 13 12:05:13 crc kubenswrapper[4786]: I0313 12:05:13.743002 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jhshw" event={"ID":"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97","Type":"ContainerStarted","Data":"5892b464f473eac1a1a267fe89e3b62e0107f5b34bc57dd11c3f690866018c1e"} Mar 13 12:05:15 crc kubenswrapper[4786]: I0313 12:05:15.759549 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jhshw" event={"ID":"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97","Type":"ContainerStarted","Data":"11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554"} Mar 13 12:05:15 crc kubenswrapper[4786]: I0313 12:05:15.787827 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-jhshw" podStartSLOduration=2.781571407 podStartE2EDuration="3.787797648s" podCreationTimestamp="2026-03-13 12:05:12 +0000 UTC" firstStartedPulling="2026-03-13 12:05:13.602597412 +0000 UTC m=+1100.882250869" lastFinishedPulling="2026-03-13 12:05:14.608823663 +0000 UTC m=+1101.888477110" observedRunningTime="2026-03-13 12:05:15.784811388 +0000 UTC m=+1103.064464925" watchObservedRunningTime="2026-03-13 12:05:15.787797648 +0000 UTC m=+1103.067451135" Mar 13 12:05:16 crc kubenswrapper[4786]: I0313 12:05:16.166247 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jhshw"] Mar 13 12:05:16 crc kubenswrapper[4786]: I0313 12:05:16.769851 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-g8xgg"] Mar 13 12:05:16 crc kubenswrapper[4786]: I0313 12:05:16.770593 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:16 crc kubenswrapper[4786]: I0313 12:05:16.778453 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g8xgg"] Mar 13 12:05:16 crc kubenswrapper[4786]: I0313 12:05:16.790174 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55v9j\" (UniqueName: \"kubernetes.io/projected/c133d0bb-ca55-4518-8150-5f2e1ab0dbe3-kube-api-access-55v9j\") pod \"openstack-operator-index-g8xgg\" (UID: \"c133d0bb-ca55-4518-8150-5f2e1ab0dbe3\") " pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:16 crc kubenswrapper[4786]: I0313 12:05:16.890611 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55v9j\" (UniqueName: \"kubernetes.io/projected/c133d0bb-ca55-4518-8150-5f2e1ab0dbe3-kube-api-access-55v9j\") pod \"openstack-operator-index-g8xgg\" (UID: 
\"c133d0bb-ca55-4518-8150-5f2e1ab0dbe3\") " pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:16 crc kubenswrapper[4786]: I0313 12:05:16.934436 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55v9j\" (UniqueName: \"kubernetes.io/projected/c133d0bb-ca55-4518-8150-5f2e1ab0dbe3-kube-api-access-55v9j\") pod \"openstack-operator-index-g8xgg\" (UID: \"c133d0bb-ca55-4518-8150-5f2e1ab0dbe3\") " pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:17 crc kubenswrapper[4786]: I0313 12:05:17.093473 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:17 crc kubenswrapper[4786]: I0313 12:05:17.590570 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g8xgg"] Mar 13 12:05:17 crc kubenswrapper[4786]: W0313 12:05:17.601853 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc133d0bb_ca55_4518_8150_5f2e1ab0dbe3.slice/crio-e3a97a47938807f0ace179f428b23d3f71f3deff8fe6c3a45146d626be116b3b WatchSource:0}: Error finding container e3a97a47938807f0ace179f428b23d3f71f3deff8fe6c3a45146d626be116b3b: Status 404 returned error can't find the container with id e3a97a47938807f0ace179f428b23d3f71f3deff8fe6c3a45146d626be116b3b Mar 13 12:05:17 crc kubenswrapper[4786]: I0313 12:05:17.778157 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g8xgg" event={"ID":"c133d0bb-ca55-4518-8150-5f2e1ab0dbe3","Type":"ContainerStarted","Data":"e3a97a47938807f0ace179f428b23d3f71f3deff8fe6c3a45146d626be116b3b"} Mar 13 12:05:17 crc kubenswrapper[4786]: I0313 12:05:17.778322 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jhshw" podUID="e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97" 
containerName="registry-server" containerID="cri-o://11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554" gracePeriod=2 Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.239100 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jhshw" Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.413203 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp2gw\" (UniqueName: \"kubernetes.io/projected/e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97-kube-api-access-dp2gw\") pod \"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97\" (UID: \"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97\") " Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.418359 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97-kube-api-access-dp2gw" (OuterVolumeSpecName: "kube-api-access-dp2gw") pod "e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97" (UID: "e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97"). InnerVolumeSpecName "kube-api-access-dp2gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.515518 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp2gw\" (UniqueName: \"kubernetes.io/projected/e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97-kube-api-access-dp2gw\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.790675 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g8xgg" event={"ID":"c133d0bb-ca55-4518-8150-5f2e1ab0dbe3","Type":"ContainerStarted","Data":"f64240a9b16826c0d28f57277c2f5208129ae66daaa55a09adb70bf252d8fd8c"} Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.795800 4786 generic.go:334] "Generic (PLEG): container finished" podID="e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97" containerID="11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554" exitCode=0 Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.795866 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jhshw" Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.795880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jhshw" event={"ID":"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97","Type":"ContainerDied","Data":"11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554"} Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.795979 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jhshw" event={"ID":"e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97","Type":"ContainerDied","Data":"5892b464f473eac1a1a267fe89e3b62e0107f5b34bc57dd11c3f690866018c1e"} Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.796003 4786 scope.go:117] "RemoveContainer" containerID="11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554" Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.828015 4786 scope.go:117] "RemoveContainer" containerID="11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554" Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.828449 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-g8xgg" podStartSLOduration=2.37742151 podStartE2EDuration="2.828428818s" podCreationTimestamp="2026-03-13 12:05:16 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.606082436 +0000 UTC m=+1104.885735923" lastFinishedPulling="2026-03-13 12:05:18.057089774 +0000 UTC m=+1105.336743231" observedRunningTime="2026-03-13 12:05:18.819912949 +0000 UTC m=+1106.099566436" watchObservedRunningTime="2026-03-13 12:05:18.828428818 +0000 UTC m=+1106.108082305" Mar 13 12:05:18 crc kubenswrapper[4786]: E0313 12:05:18.829329 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554\": container with 
ID starting with 11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554 not found: ID does not exist" containerID="11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554" Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.829382 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554"} err="failed to get container status \"11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554\": rpc error: code = NotFound desc = could not find container \"11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554\": container with ID starting with 11bb0315fc1c91c5250d8f67d3813812deea89e4be2ac6e9b9d38fdfdfce8554 not found: ID does not exist" Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.846123 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jhshw"] Mar 13 12:05:18 crc kubenswrapper[4786]: I0313 12:05:18.851051 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jhshw"] Mar 13 12:05:19 crc kubenswrapper[4786]: I0313 12:05:19.454100 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97" path="/var/lib/kubelet/pods/e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97/volumes" Mar 13 12:05:27 crc kubenswrapper[4786]: I0313 12:05:27.094758 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:27 crc kubenswrapper[4786]: I0313 12:05:27.095652 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:27 crc kubenswrapper[4786]: I0313 12:05:27.143364 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:27 crc 
kubenswrapper[4786]: I0313 12:05:27.916069 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-g8xgg" Mar 13 12:05:33 crc kubenswrapper[4786]: I0313 12:05:33.965533 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw"] Mar 13 12:05:33 crc kubenswrapper[4786]: E0313 12:05:33.966233 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97" containerName="registry-server" Mar 13 12:05:33 crc kubenswrapper[4786]: I0313 12:05:33.966254 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97" containerName="registry-server" Mar 13 12:05:33 crc kubenswrapper[4786]: I0313 12:05:33.966425 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89c6aa4-f637-4d5b-8cb3-a1c8a9d95d97" containerName="registry-server" Mar 13 12:05:33 crc kubenswrapper[4786]: I0313 12:05:33.967693 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:33 crc kubenswrapper[4786]: I0313 12:05:33.970226 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-f4kv2" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:33.987469 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw"] Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.140830 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.140931 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.141037 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/03cba3c5-ae6a-4348-9c80-f38790f5b763-kube-api-access-shmcj\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 
12:05:34.242321 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.242420 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/03cba3c5-ae6a-4348-9c80-f38790f5b763-kube-api-access-shmcj\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.242558 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.243455 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.243481 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.279538 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/03cba3c5-ae6a-4348-9c80-f38790f5b763-kube-api-access-shmcj\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.315575 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.567332 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw"] Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.933288 4786 generic.go:334] "Generic (PLEG): container finished" podID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerID="5baf2f8b7eeee6a539f8a563e28373b86b03a7971cf24d58665a9c2f40f9f7bf" exitCode=0 Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.933387 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" event={"ID":"03cba3c5-ae6a-4348-9c80-f38790f5b763","Type":"ContainerDied","Data":"5baf2f8b7eeee6a539f8a563e28373b86b03a7971cf24d58665a9c2f40f9f7bf"} Mar 13 12:05:34 crc kubenswrapper[4786]: I0313 12:05:34.933597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" event={"ID":"03cba3c5-ae6a-4348-9c80-f38790f5b763","Type":"ContainerStarted","Data":"22211549ad4af67f60740d7b73f8f8d02d6ed1dfe8787b0b005cd76ab34f52b8"} Mar 13 12:05:35 crc kubenswrapper[4786]: I0313 12:05:35.944258 4786 generic.go:334] "Generic (PLEG): container finished" podID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerID="c52f41da6c72d990cb34c43771672eda2cefb9655006a80d40ce4bb8c0faace0" exitCode=0 Mar 13 12:05:35 crc kubenswrapper[4786]: I0313 12:05:35.944329 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" event={"ID":"03cba3c5-ae6a-4348-9c80-f38790f5b763","Type":"ContainerDied","Data":"c52f41da6c72d990cb34c43771672eda2cefb9655006a80d40ce4bb8c0faace0"} Mar 13 12:05:36 crc kubenswrapper[4786]: I0313 12:05:36.955084 4786 generic.go:334] "Generic (PLEG): container finished" podID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerID="00a28797c6bf374deddc7537c57c0467de01a38997889fd9fa8dde57c2fbf016" exitCode=0 Mar 13 12:05:36 crc kubenswrapper[4786]: I0313 12:05:36.955173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" event={"ID":"03cba3c5-ae6a-4348-9c80-f38790f5b763","Type":"ContainerDied","Data":"00a28797c6bf374deddc7537c57c0467de01a38997889fd9fa8dde57c2fbf016"} Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.145342 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.249229 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-util\") pod \"03cba3c5-ae6a-4348-9c80-f38790f5b763\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.249323 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-bundle\") pod \"03cba3c5-ae6a-4348-9c80-f38790f5b763\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.249400 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/03cba3c5-ae6a-4348-9c80-f38790f5b763-kube-api-access-shmcj\") pod \"03cba3c5-ae6a-4348-9c80-f38790f5b763\" (UID: \"03cba3c5-ae6a-4348-9c80-f38790f5b763\") " Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.250115 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-bundle" (OuterVolumeSpecName: "bundle") pod "03cba3c5-ae6a-4348-9c80-f38790f5b763" (UID: "03cba3c5-ae6a-4348-9c80-f38790f5b763"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.255116 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03cba3c5-ae6a-4348-9c80-f38790f5b763-kube-api-access-shmcj" (OuterVolumeSpecName: "kube-api-access-shmcj") pod "03cba3c5-ae6a-4348-9c80-f38790f5b763" (UID: "03cba3c5-ae6a-4348-9c80-f38790f5b763"). InnerVolumeSpecName "kube-api-access-shmcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.276087 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-util" (OuterVolumeSpecName: "util") pod "03cba3c5-ae6a-4348-9c80-f38790f5b763" (UID: "03cba3c5-ae6a-4348-9c80-f38790f5b763"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.350441 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.350483 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/03cba3c5-ae6a-4348-9c80-f38790f5b763-kube-api-access-shmcj\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.350498 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03cba3c5-ae6a-4348-9c80-f38790f5b763-util\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.982923 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" event={"ID":"03cba3c5-ae6a-4348-9c80-f38790f5b763","Type":"ContainerDied","Data":"22211549ad4af67f60740d7b73f8f8d02d6ed1dfe8787b0b005cd76ab34f52b8"} Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.983022 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22211549ad4af67f60740d7b73f8f8d02d6ed1dfe8787b0b005cd76ab34f52b8" Mar 13 12:05:39 crc kubenswrapper[4786]: I0313 12:05:39.983042 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw" Mar 13 12:05:45 crc kubenswrapper[4786]: I0313 12:05:45.941582 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc"] Mar 13 12:05:45 crc kubenswrapper[4786]: E0313 12:05:45.942646 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerName="extract" Mar 13 12:05:45 crc kubenswrapper[4786]: I0313 12:05:45.942664 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerName="extract" Mar 13 12:05:45 crc kubenswrapper[4786]: E0313 12:05:45.942680 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerName="util" Mar 13 12:05:45 crc kubenswrapper[4786]: I0313 12:05:45.942690 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerName="util" Mar 13 12:05:45 crc kubenswrapper[4786]: E0313 12:05:45.942717 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerName="pull" Mar 13 12:05:45 crc kubenswrapper[4786]: I0313 12:05:45.942726 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerName="pull" Mar 13 12:05:45 crc kubenswrapper[4786]: I0313 12:05:45.943632 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="03cba3c5-ae6a-4348-9c80-f38790f5b763" containerName="extract" Mar 13 12:05:45 crc kubenswrapper[4786]: I0313 12:05:45.945547 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" Mar 13 12:05:45 crc kubenswrapper[4786]: I0313 12:05:45.965378 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-vgvr8" Mar 13 12:05:45 crc kubenswrapper[4786]: I0313 12:05:45.978203 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc"] Mar 13 12:05:46 crc kubenswrapper[4786]: I0313 12:05:46.048684 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdtkp\" (UniqueName: \"kubernetes.io/projected/f0b720fc-612f-48bb-9681-9fc6c6b102f4-kube-api-access-vdtkp\") pod \"openstack-operator-controller-init-65b9994cf8-cgmkc\" (UID: \"f0b720fc-612f-48bb-9681-9fc6c6b102f4\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" Mar 13 12:05:46 crc kubenswrapper[4786]: I0313 12:05:46.150232 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdtkp\" (UniqueName: \"kubernetes.io/projected/f0b720fc-612f-48bb-9681-9fc6c6b102f4-kube-api-access-vdtkp\") pod \"openstack-operator-controller-init-65b9994cf8-cgmkc\" (UID: \"f0b720fc-612f-48bb-9681-9fc6c6b102f4\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" Mar 13 12:05:46 crc kubenswrapper[4786]: I0313 12:05:46.177154 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdtkp\" (UniqueName: \"kubernetes.io/projected/f0b720fc-612f-48bb-9681-9fc6c6b102f4-kube-api-access-vdtkp\") pod \"openstack-operator-controller-init-65b9994cf8-cgmkc\" (UID: \"f0b720fc-612f-48bb-9681-9fc6c6b102f4\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" Mar 13 12:05:46 crc kubenswrapper[4786]: I0313 12:05:46.270528 4786 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" Mar 13 12:05:46 crc kubenswrapper[4786]: I0313 12:05:46.477712 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc"] Mar 13 12:05:46 crc kubenswrapper[4786]: W0313 12:05:46.486247 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0b720fc_612f_48bb_9681_9fc6c6b102f4.slice/crio-a49a19f9bef4aa1a8187b5452a494f7294f98c375905f8169a718343d3f2268c WatchSource:0}: Error finding container a49a19f9bef4aa1a8187b5452a494f7294f98c375905f8169a718343d3f2268c: Status 404 returned error can't find the container with id a49a19f9bef4aa1a8187b5452a494f7294f98c375905f8169a718343d3f2268c Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.025010 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" event={"ID":"f0b720fc-612f-48bb-9681-9fc6c6b102f4","Type":"ContainerStarted","Data":"a49a19f9bef4aa1a8187b5452a494f7294f98c375905f8169a718343d3f2268c"} Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.076614 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2bbkh"] Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.077954 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.086761 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bbkh"] Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.162953 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtgsr\" (UniqueName: \"kubernetes.io/projected/bd479800-84dd-48a6-96f5-851b183494b4-kube-api-access-gtgsr\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.163027 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-utilities\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.163066 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-catalog-content\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.264691 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtgsr\" (UniqueName: \"kubernetes.io/projected/bd479800-84dd-48a6-96f5-851b183494b4-kube-api-access-gtgsr\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.264759 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-utilities\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.264795 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-catalog-content\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.265392 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-catalog-content\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.265392 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-utilities\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.305142 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtgsr\" (UniqueName: \"kubernetes.io/projected/bd479800-84dd-48a6-96f5-851b183494b4-kube-api-access-gtgsr\") pod \"redhat-marketplace-2bbkh\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.403157 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:47 crc kubenswrapper[4786]: I0313 12:05:47.762087 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bbkh"] Mar 13 12:05:48 crc kubenswrapper[4786]: I0313 12:05:48.036324 4786 generic.go:334] "Generic (PLEG): container finished" podID="bd479800-84dd-48a6-96f5-851b183494b4" containerID="533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3" exitCode=0 Mar 13 12:05:48 crc kubenswrapper[4786]: I0313 12:05:48.037372 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bbkh" event={"ID":"bd479800-84dd-48a6-96f5-851b183494b4","Type":"ContainerDied","Data":"533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3"} Mar 13 12:05:48 crc kubenswrapper[4786]: I0313 12:05:48.037409 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bbkh" event={"ID":"bd479800-84dd-48a6-96f5-851b183494b4","Type":"ContainerStarted","Data":"40fc84aefd6ee0cbafdde444edc59705477c1554e2a10939db6e38109b5e312e"} Mar 13 12:05:52 crc kubenswrapper[4786]: I0313 12:05:52.061505 4786 generic.go:334] "Generic (PLEG): container finished" podID="bd479800-84dd-48a6-96f5-851b183494b4" containerID="3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31" exitCode=0 Mar 13 12:05:52 crc kubenswrapper[4786]: I0313 12:05:52.061570 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bbkh" event={"ID":"bd479800-84dd-48a6-96f5-851b183494b4","Type":"ContainerDied","Data":"3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31"} Mar 13 12:05:52 crc kubenswrapper[4786]: I0313 12:05:52.065477 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" 
event={"ID":"f0b720fc-612f-48bb-9681-9fc6c6b102f4","Type":"ContainerStarted","Data":"b4caf6eeb2f23162a54e9b2935b212bbbab7b059f426106b8ffed0513de823d2"} Mar 13 12:05:52 crc kubenswrapper[4786]: I0313 12:05:52.065959 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" Mar 13 12:05:52 crc kubenswrapper[4786]: I0313 12:05:52.144625 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" podStartSLOduration=2.318693137 podStartE2EDuration="7.144599029s" podCreationTimestamp="2026-03-13 12:05:45 +0000 UTC" firstStartedPulling="2026-03-13 12:05:46.488135402 +0000 UTC m=+1133.767788839" lastFinishedPulling="2026-03-13 12:05:51.314041284 +0000 UTC m=+1138.593694731" observedRunningTime="2026-03-13 12:05:52.143037418 +0000 UTC m=+1139.422690905" watchObservedRunningTime="2026-03-13 12:05:52.144599029 +0000 UTC m=+1139.424252536" Mar 13 12:05:53 crc kubenswrapper[4786]: I0313 12:05:53.076773 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bbkh" event={"ID":"bd479800-84dd-48a6-96f5-851b183494b4","Type":"ContainerStarted","Data":"7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155"} Mar 13 12:05:53 crc kubenswrapper[4786]: I0313 12:05:53.092581 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2bbkh" podStartSLOduration=1.558760447 podStartE2EDuration="6.092563143s" podCreationTimestamp="2026-03-13 12:05:47 +0000 UTC" firstStartedPulling="2026-03-13 12:05:48.038995409 +0000 UTC m=+1135.318648856" lastFinishedPulling="2026-03-13 12:05:52.572798085 +0000 UTC m=+1139.852451552" observedRunningTime="2026-03-13 12:05:53.089683345 +0000 UTC m=+1140.369336812" watchObservedRunningTime="2026-03-13 12:05:53.092563143 +0000 UTC m=+1140.372216590" Mar 13 12:05:56 crc 
kubenswrapper[4786]: I0313 12:05:56.274164 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-cgmkc" Mar 13 12:05:57 crc kubenswrapper[4786]: I0313 12:05:57.404140 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:57 crc kubenswrapper[4786]: I0313 12:05:57.404245 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:57 crc kubenswrapper[4786]: I0313 12:05:57.448097 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:05:58 crc kubenswrapper[4786]: I0313 12:05:58.144431 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.131576 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556726-9kvnq"] Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.132502 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.134681 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.137691 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.138423 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.139114 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-9kvnq"] Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.284957 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6p9\" (UniqueName: \"kubernetes.io/projected/cb268aac-2924-44f6-9f5a-1cd5a3c770a6-kube-api-access-8x6p9\") pod \"auto-csr-approver-29556726-9kvnq\" (UID: \"cb268aac-2924-44f6-9f5a-1cd5a3c770a6\") " pod="openshift-infra/auto-csr-approver-29556726-9kvnq" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.386608 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6p9\" (UniqueName: \"kubernetes.io/projected/cb268aac-2924-44f6-9f5a-1cd5a3c770a6-kube-api-access-8x6p9\") pod \"auto-csr-approver-29556726-9kvnq\" (UID: \"cb268aac-2924-44f6-9f5a-1cd5a3c770a6\") " pod="openshift-infra/auto-csr-approver-29556726-9kvnq" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.411736 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6p9\" (UniqueName: \"kubernetes.io/projected/cb268aac-2924-44f6-9f5a-1cd5a3c770a6-kube-api-access-8x6p9\") pod \"auto-csr-approver-29556726-9kvnq\" (UID: \"cb268aac-2924-44f6-9f5a-1cd5a3c770a6\") " 
pod="openshift-infra/auto-csr-approver-29556726-9kvnq" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.493649 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" Mar 13 12:06:00 crc kubenswrapper[4786]: I0313 12:06:00.675478 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-9kvnq"] Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.068028 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bbkh"] Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.068289 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2bbkh" podUID="bd479800-84dd-48a6-96f5-851b183494b4" containerName="registry-server" containerID="cri-o://7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155" gracePeriod=2 Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.127371 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" event={"ID":"cb268aac-2924-44f6-9f5a-1cd5a3c770a6","Type":"ContainerStarted","Data":"d1ce29137e7a1e6658c4dc477e5007a999517464f57f2e3ca6b491adc756d7b8"} Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.388200 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.505367 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-catalog-content\") pod \"bd479800-84dd-48a6-96f5-851b183494b4\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.505477 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtgsr\" (UniqueName: \"kubernetes.io/projected/bd479800-84dd-48a6-96f5-851b183494b4-kube-api-access-gtgsr\") pod \"bd479800-84dd-48a6-96f5-851b183494b4\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.505510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-utilities\") pod \"bd479800-84dd-48a6-96f5-851b183494b4\" (UID: \"bd479800-84dd-48a6-96f5-851b183494b4\") " Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.507528 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-utilities" (OuterVolumeSpecName: "utilities") pod "bd479800-84dd-48a6-96f5-851b183494b4" (UID: "bd479800-84dd-48a6-96f5-851b183494b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.514028 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd479800-84dd-48a6-96f5-851b183494b4-kube-api-access-gtgsr" (OuterVolumeSpecName: "kube-api-access-gtgsr") pod "bd479800-84dd-48a6-96f5-851b183494b4" (UID: "bd479800-84dd-48a6-96f5-851b183494b4"). InnerVolumeSpecName "kube-api-access-gtgsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.548226 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd479800-84dd-48a6-96f5-851b183494b4" (UID: "bd479800-84dd-48a6-96f5-851b183494b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.607358 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.607391 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtgsr\" (UniqueName: \"kubernetes.io/projected/bd479800-84dd-48a6-96f5-851b183494b4-kube-api-access-gtgsr\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:01 crc kubenswrapper[4786]: I0313 12:06:01.607403 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd479800-84dd-48a6-96f5-851b183494b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.138530 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" event={"ID":"cb268aac-2924-44f6-9f5a-1cd5a3c770a6","Type":"ContainerStarted","Data":"9f7c10ddaa42f6b54b6af8e66973ed9e444fdeff3412fa2305b7b72c37c849f7"} Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.140564 4786 generic.go:334] "Generic (PLEG): container finished" podID="bd479800-84dd-48a6-96f5-851b183494b4" containerID="7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155" exitCode=0 Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.140594 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-2bbkh" event={"ID":"bd479800-84dd-48a6-96f5-851b183494b4","Type":"ContainerDied","Data":"7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155"} Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.140616 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bbkh" event={"ID":"bd479800-84dd-48a6-96f5-851b183494b4","Type":"ContainerDied","Data":"40fc84aefd6ee0cbafdde444edc59705477c1554e2a10939db6e38109b5e312e"} Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.140633 4786 scope.go:117] "RemoveContainer" containerID="7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.140657 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2bbkh" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.163185 4786 scope.go:117] "RemoveContainer" containerID="3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.167403 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" podStartSLOduration=1.047715969 podStartE2EDuration="2.16737788s" podCreationTimestamp="2026-03-13 12:06:00 +0000 UTC" firstStartedPulling="2026-03-13 12:06:00.689483865 +0000 UTC m=+1147.969137312" lastFinishedPulling="2026-03-13 12:06:01.809145776 +0000 UTC m=+1149.088799223" observedRunningTime="2026-03-13 12:06:02.158567082 +0000 UTC m=+1149.438220539" watchObservedRunningTime="2026-03-13 12:06:02.16737788 +0000 UTC m=+1149.447031367" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.177061 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bbkh"] Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.186205 4786 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bbkh"] Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.194072 4786 scope.go:117] "RemoveContainer" containerID="533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.211304 4786 scope.go:117] "RemoveContainer" containerID="7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155" Mar 13 12:06:02 crc kubenswrapper[4786]: E0313 12:06:02.211679 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155\": container with ID starting with 7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155 not found: ID does not exist" containerID="7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.211708 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155"} err="failed to get container status \"7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155\": rpc error: code = NotFound desc = could not find container \"7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155\": container with ID starting with 7f8b8db7725209eb1457590871082201a0a3232f5255667f458854beca47a155 not found: ID does not exist" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.211727 4786 scope.go:117] "RemoveContainer" containerID="3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31" Mar 13 12:06:02 crc kubenswrapper[4786]: E0313 12:06:02.212132 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31\": container with ID starting with 
3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31 not found: ID does not exist" containerID="3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.212158 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31"} err="failed to get container status \"3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31\": rpc error: code = NotFound desc = could not find container \"3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31\": container with ID starting with 3da255aec43441f529ea5bd49fda12d489d5ba24d1d44d097959dcede3a16b31 not found: ID does not exist" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.212173 4786 scope.go:117] "RemoveContainer" containerID="533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3" Mar 13 12:06:02 crc kubenswrapper[4786]: E0313 12:06:02.212550 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3\": container with ID starting with 533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3 not found: ID does not exist" containerID="533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3" Mar 13 12:06:02 crc kubenswrapper[4786]: I0313 12:06:02.212588 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3"} err="failed to get container status \"533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3\": rpc error: code = NotFound desc = could not find container \"533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3\": container with ID starting with 533bfaaabe754ddda060fc5bc69b356b946f0f6ae1163e5e923c136f5be97ef3 not found: ID does not 
exist" Mar 13 12:06:03 crc kubenswrapper[4786]: I0313 12:06:03.151102 4786 generic.go:334] "Generic (PLEG): container finished" podID="cb268aac-2924-44f6-9f5a-1cd5a3c770a6" containerID="9f7c10ddaa42f6b54b6af8e66973ed9e444fdeff3412fa2305b7b72c37c849f7" exitCode=0 Mar 13 12:06:03 crc kubenswrapper[4786]: I0313 12:06:03.151149 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" event={"ID":"cb268aac-2924-44f6-9f5a-1cd5a3c770a6","Type":"ContainerDied","Data":"9f7c10ddaa42f6b54b6af8e66973ed9e444fdeff3412fa2305b7b72c37c849f7"} Mar 13 12:06:03 crc kubenswrapper[4786]: I0313 12:06:03.448380 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd479800-84dd-48a6-96f5-851b183494b4" path="/var/lib/kubelet/pods/bd479800-84dd-48a6-96f5-851b183494b4/volumes" Mar 13 12:06:04 crc kubenswrapper[4786]: I0313 12:06:04.395364 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" Mar 13 12:06:04 crc kubenswrapper[4786]: I0313 12:06:04.546686 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x6p9\" (UniqueName: \"kubernetes.io/projected/cb268aac-2924-44f6-9f5a-1cd5a3c770a6-kube-api-access-8x6p9\") pod \"cb268aac-2924-44f6-9f5a-1cd5a3c770a6\" (UID: \"cb268aac-2924-44f6-9f5a-1cd5a3c770a6\") " Mar 13 12:06:04 crc kubenswrapper[4786]: I0313 12:06:04.551750 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb268aac-2924-44f6-9f5a-1cd5a3c770a6-kube-api-access-8x6p9" (OuterVolumeSpecName: "kube-api-access-8x6p9") pod "cb268aac-2924-44f6-9f5a-1cd5a3c770a6" (UID: "cb268aac-2924-44f6-9f5a-1cd5a3c770a6"). InnerVolumeSpecName "kube-api-access-8x6p9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:04 crc kubenswrapper[4786]: I0313 12:06:04.648463 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x6p9\" (UniqueName: \"kubernetes.io/projected/cb268aac-2924-44f6-9f5a-1cd5a3c770a6-kube-api-access-8x6p9\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:05 crc kubenswrapper[4786]: I0313 12:06:05.174935 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" event={"ID":"cb268aac-2924-44f6-9f5a-1cd5a3c770a6","Type":"ContainerDied","Data":"d1ce29137e7a1e6658c4dc477e5007a999517464f57f2e3ca6b491adc756d7b8"} Mar 13 12:06:05 crc kubenswrapper[4786]: I0313 12:06:05.174982 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ce29137e7a1e6658c4dc477e5007a999517464f57f2e3ca6b491adc756d7b8" Mar 13 12:06:05 crc kubenswrapper[4786]: I0313 12:06:05.175020 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-9kvnq" Mar 13 12:06:05 crc kubenswrapper[4786]: I0313 12:06:05.223021 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-zx752"] Mar 13 12:06:05 crc kubenswrapper[4786]: I0313 12:06:05.230134 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-zx752"] Mar 13 12:06:05 crc kubenswrapper[4786]: I0313 12:06:05.446723 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce0f979-89dc-435e-abf3-0a4a4102c338" path="/var/lib/kubelet/pods/9ce0f979-89dc-435e-abf3-0a4a4102c338/volumes" Mar 13 12:06:10 crc kubenswrapper[4786]: I0313 12:06:10.738675 4786 scope.go:117] "RemoveContainer" containerID="800125b044ed448a929edb34f5731e31d2715a0df4bf5ac7f81b181924e5eabe" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.925008 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd"] Mar 13 12:06:34 crc kubenswrapper[4786]: E0313 12:06:34.925833 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd479800-84dd-48a6-96f5-851b183494b4" containerName="extract-content" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.925849 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd479800-84dd-48a6-96f5-851b183494b4" containerName="extract-content" Mar 13 12:06:34 crc kubenswrapper[4786]: E0313 12:06:34.925906 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd479800-84dd-48a6-96f5-851b183494b4" containerName="registry-server" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.925918 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd479800-84dd-48a6-96f5-851b183494b4" containerName="registry-server" Mar 13 12:06:34 crc kubenswrapper[4786]: E0313 12:06:34.925938 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd479800-84dd-48a6-96f5-851b183494b4" containerName="extract-utilities" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.925946 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd479800-84dd-48a6-96f5-851b183494b4" containerName="extract-utilities" Mar 13 12:06:34 crc kubenswrapper[4786]: E0313 12:06:34.925959 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb268aac-2924-44f6-9f5a-1cd5a3c770a6" containerName="oc" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.925966 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb268aac-2924-44f6-9f5a-1cd5a3c770a6" containerName="oc" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.926100 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd479800-84dd-48a6-96f5-851b183494b4" containerName="registry-server" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.926116 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cb268aac-2924-44f6-9f5a-1cd5a3c770a6" containerName="oc" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.926634 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.930242 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-cjjkw" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.931297 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5"] Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.935708 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.937425 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qpkvk" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.942252 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd"] Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.948665 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5"] Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.965660 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l"] Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.966329 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l"] Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.966789 4786 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.967214 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.971939 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-k79gw" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.980268 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-lj8sp" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.981867 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j87rt\" (UniqueName: \"kubernetes.io/projected/4eb75275-4f14-406c-950a-fa40061041af-kube-api-access-j87rt\") pod \"cinder-operator-controller-manager-984cd4dcf-652d5\" (UID: \"4eb75275-4f14-406c-950a-fa40061041af\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.981992 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnkzf\" (UniqueName: \"kubernetes.io/projected/8ba2beff-196b-4a24-a490-86a81b9f7495-kube-api-access-rnkzf\") pod \"glance-operator-controller-manager-5964f64c48-kgm9l\" (UID: \"8ba2beff-196b-4a24-a490-86a81b9f7495\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.982038 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdgsm\" (UniqueName: \"kubernetes.io/projected/47c63b16-0044-4bef-848e-084b958e853b-kube-api-access-tdgsm\") pod 
\"designate-operator-controller-manager-66d56f6ff4-wk47l\" (UID: \"47c63b16-0044-4bef-848e-084b958e853b\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.982059 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9f6t\" (UniqueName: \"kubernetes.io/projected/28f6ab30-9436-45e7-a94f-b9757e0dc331-kube-api-access-x9f6t\") pod \"barbican-operator-controller-manager-677bd678f7-wrxdd\" (UID: \"28f6ab30-9436-45e7-a94f-b9757e0dc331\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" Mar 13 12:06:34 crc kubenswrapper[4786]: I0313 12:06:34.988800 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.017451 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.019743 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.023980 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-c4jz4" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.053748 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.079903 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.087095 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnkzf\" (UniqueName: \"kubernetes.io/projected/8ba2beff-196b-4a24-a490-86a81b9f7495-kube-api-access-rnkzf\") pod \"glance-operator-controller-manager-5964f64c48-kgm9l\" (UID: \"8ba2beff-196b-4a24-a490-86a81b9f7495\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.087195 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdgsm\" (UniqueName: \"kubernetes.io/projected/47c63b16-0044-4bef-848e-084b958e853b-kube-api-access-tdgsm\") pod \"designate-operator-controller-manager-66d56f6ff4-wk47l\" (UID: \"47c63b16-0044-4bef-848e-084b958e853b\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.087231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9f6t\" (UniqueName: \"kubernetes.io/projected/28f6ab30-9436-45e7-a94f-b9757e0dc331-kube-api-access-x9f6t\") pod \"barbican-operator-controller-manager-677bd678f7-wrxdd\" (UID: 
\"28f6ab30-9436-45e7-a94f-b9757e0dc331\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.087269 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j87rt\" (UniqueName: \"kubernetes.io/projected/4eb75275-4f14-406c-950a-fa40061041af-kube-api-access-j87rt\") pod \"cinder-operator-controller-manager-984cd4dcf-652d5\" (UID: \"4eb75275-4f14-406c-950a-fa40061041af\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.088677 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.089627 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.098355 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-thr75"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.099231 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ckn7k" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.099281 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.109488 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-d9kpz" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.109719 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.119530 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j87rt\" (UniqueName: \"kubernetes.io/projected/4eb75275-4f14-406c-950a-fa40061041af-kube-api-access-j87rt\") pod \"cinder-operator-controller-manager-984cd4dcf-652d5\" (UID: \"4eb75275-4f14-406c-950a-fa40061041af\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.120407 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9f6t\" (UniqueName: \"kubernetes.io/projected/28f6ab30-9436-45e7-a94f-b9757e0dc331-kube-api-access-x9f6t\") pod \"barbican-operator-controller-manager-677bd678f7-wrxdd\" (UID: \"28f6ab30-9436-45e7-a94f-b9757e0dc331\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.130844 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdgsm\" (UniqueName: \"kubernetes.io/projected/47c63b16-0044-4bef-848e-084b958e853b-kube-api-access-tdgsm\") pod \"designate-operator-controller-manager-66d56f6ff4-wk47l\" (UID: \"47c63b16-0044-4bef-848e-084b958e853b\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.133407 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rnkzf\" (UniqueName: \"kubernetes.io/projected/8ba2beff-196b-4a24-a490-86a81b9f7495-kube-api-access-rnkzf\") pod \"glance-operator-controller-manager-5964f64c48-kgm9l\" (UID: \"8ba2beff-196b-4a24-a490-86a81b9f7495\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.145353 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.158818 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.160009 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.162126 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-76j9s" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.175498 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-thr75"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.188252 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgrq6\" (UniqueName: \"kubernetes.io/projected/17f6df05-f37f-4863-b967-7b27429282f2-kube-api-access-jgrq6\") pod \"heat-operator-controller-manager-77b6666d85-4jnjg\" (UID: \"17f6df05-f37f-4863-b967-7b27429282f2\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.193895 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv"] Mar 
13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.194750 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.196731 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-htxvn" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.199623 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.203747 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.207451 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.208292 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.209932 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-czzx4" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.217124 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.218215 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.220444 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4856k" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.224837 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.228944 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.235045 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.236196 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.239045 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-kczvd" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.248976 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.261403 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.262409 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.265373 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.266066 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.266435 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-x4t5c" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.267990 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-zt86h" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.272573 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.274250 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.285399 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.290538 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfmjj\" (UniqueName: \"kubernetes.io/projected/77819c69-e0b5-4eb8-a124-fb1339701ccb-kube-api-access-nfmjj\") pod \"horizon-operator-controller-manager-6d9d6b584d-9w5pr\" (UID: \"77819c69-e0b5-4eb8-a124-fb1339701ccb\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.291102 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgrq6\" (UniqueName: \"kubernetes.io/projected/17f6df05-f37f-4863-b967-7b27429282f2-kube-api-access-jgrq6\") pod \"heat-operator-controller-manager-77b6666d85-4jnjg\" (UID: \"17f6df05-f37f-4863-b967-7b27429282f2\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.291155 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrssv\" (UniqueName: \"kubernetes.io/projected/e4075ad0-d00e-4675-97e6-87e1d7e845d9-kube-api-access-nrssv\") pod \"ironic-operator-controller-manager-6bbb499bbc-mf6t8\" (UID: \"e4075ad0-d00e-4675-97e6-87e1d7e845d9\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.291189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:35 crc kubenswrapper[4786]: 
I0313 12:06:35.291239 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44zzb\" (UniqueName: \"kubernetes.io/projected/784ee575-162b-4732-b82c-8f4b3c1e5317-kube-api-access-44zzb\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.301910 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.307610 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.316444 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgrq6\" (UniqueName: \"kubernetes.io/projected/17f6df05-f37f-4863-b967-7b27429282f2-kube-api-access-jgrq6\") pod \"heat-operator-controller-manager-77b6666d85-4jnjg\" (UID: \"17f6df05-f37f-4863-b967-7b27429282f2\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.316509 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.326529 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.333765 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.336472 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.336722 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-76p8h" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.337516 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.357579 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.368495 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.369459 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.371580 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-wb9hc" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.390055 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393412 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l99h\" (UniqueName: \"kubernetes.io/projected/810beff6-dacb-486e-be5b-fc4ad06e12d3-kube-api-access-4l99h\") pod \"ovn-operator-controller-manager-bbc5b68f9-4skd6\" (UID: \"810beff6-dacb-486e-be5b-fc4ad06e12d3\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393447 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsqk4\" (UniqueName: \"kubernetes.io/projected/1073de3d-8bac-4236-a9ce-c78d7bb2865b-kube-api-access-rsqk4\") pod \"neutron-operator-controller-manager-776c5696bf-stg56\" (UID: \"1073de3d-8bac-4236-a9ce-c78d7bb2865b\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393530 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfmjj\" (UniqueName: \"kubernetes.io/projected/77819c69-e0b5-4eb8-a124-fb1339701ccb-kube-api-access-nfmjj\") pod \"horizon-operator-controller-manager-6d9d6b584d-9w5pr\" (UID: \"77819c69-e0b5-4eb8-a124-fb1339701ccb\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393561 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393593 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrssv\" (UniqueName: \"kubernetes.io/projected/e4075ad0-d00e-4675-97e6-87e1d7e845d9-kube-api-access-nrssv\") pod \"ironic-operator-controller-manager-6bbb499bbc-mf6t8\" (UID: \"e4075ad0-d00e-4675-97e6-87e1d7e845d9\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393624 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393650 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nflcl\" (UniqueName: \"kubernetes.io/projected/8902cfaa-8c11-4e52-9f6d-d579e6cd50f5-kube-api-access-nflcl\") pod \"nova-operator-controller-manager-569cc54c5-fnchb\" (UID: \"8902cfaa-8c11-4e52-9f6d-d579e6cd50f5\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393680 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2lxv\" (UniqueName: 
\"kubernetes.io/projected/3aeac64d-7cf0-407c-a460-423a0082a8e9-kube-api-access-r2lxv\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393711 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzg8m\" (UniqueName: \"kubernetes.io/projected/ad8d13c6-f90b-4eb4-adce-1d20f690cc98-kube-api-access-rzg8m\") pod \"manila-operator-controller-manager-68f45f9d9f-6dcrg\" (UID: \"ad8d13c6-f90b-4eb4-adce-1d20f690cc98\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393745 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd7xq\" (UniqueName: \"kubernetes.io/projected/52ab49ac-37a7-4ba5-a2c3-9113b6821a5d-kube-api-access-jd7xq\") pod \"octavia-operator-controller-manager-5f4f55cb5c-l2vgk\" (UID: \"52ab49ac-37a7-4ba5-a2c3-9113b6821a5d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393769 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44zzb\" (UniqueName: \"kubernetes.io/projected/784ee575-162b-4732-b82c-8f4b3c1e5317-kube-api-access-44zzb\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393796 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26879\" (UniqueName: 
\"kubernetes.io/projected/f87ad580-b279-47e4-8fdd-462285c7bead-kube-api-access-26879\") pod \"keystone-operator-controller-manager-684f77d66d-lj4bv\" (UID: \"f87ad580-b279-47e4-8fdd-462285c7bead\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.393823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpzl2\" (UniqueName: \"kubernetes.io/projected/8c1d644e-a547-48c7-bda5-95cdb6c0220f-kube-api-access-wpzl2\") pod \"mariadb-operator-controller-manager-658d4cdd5-rdph5\" (UID: \"8c1d644e-a547-48c7-bda5-95cdb6c0220f\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.394486 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.394537 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert podName:784ee575-162b-4732-b82c-8f4b3c1e5317 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:35.894518694 +0000 UTC m=+1183.174172141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert") pod "infra-operator-controller-manager-5995f4446f-thr75" (UID: "784ee575-162b-4732-b82c-8f4b3c1e5317") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.405945 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c284s"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.406831 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.415348 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-c6cnb" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.416809 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44zzb\" (UniqueName: \"kubernetes.io/projected/784ee575-162b-4732-b82c-8f4b3c1e5317-kube-api-access-44zzb\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.418292 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrssv\" (UniqueName: \"kubernetes.io/projected/e4075ad0-d00e-4675-97e6-87e1d7e845d9-kube-api-access-nrssv\") pod \"ironic-operator-controller-manager-6bbb499bbc-mf6t8\" (UID: \"e4075ad0-d00e-4675-97e6-87e1d7e845d9\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.426185 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c284s"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.430159 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.431209 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.437093 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfmjj\" (UniqueName: \"kubernetes.io/projected/77819c69-e0b5-4eb8-a124-fb1339701ccb-kube-api-access-nfmjj\") pod \"horizon-operator-controller-manager-6d9d6b584d-9w5pr\" (UID: \"77819c69-e0b5-4eb8-a124-fb1339701ccb\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.441915 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-wlz4k" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.477567 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494555 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nflcl\" (UniqueName: \"kubernetes.io/projected/8902cfaa-8c11-4e52-9f6d-d579e6cd50f5-kube-api-access-nflcl\") pod \"nova-operator-controller-manager-569cc54c5-fnchb\" (UID: \"8902cfaa-8c11-4e52-9f6d-d579e6cd50f5\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494626 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2lxv\" (UniqueName: \"kubernetes.io/projected/3aeac64d-7cf0-407c-a460-423a0082a8e9-kube-api-access-r2lxv\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494650 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rzg8m\" (UniqueName: \"kubernetes.io/projected/ad8d13c6-f90b-4eb4-adce-1d20f690cc98-kube-api-access-rzg8m\") pod \"manila-operator-controller-manager-68f45f9d9f-6dcrg\" (UID: \"ad8d13c6-f90b-4eb4-adce-1d20f690cc98\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494680 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd7xq\" (UniqueName: \"kubernetes.io/projected/52ab49ac-37a7-4ba5-a2c3-9113b6821a5d-kube-api-access-jd7xq\") pod \"octavia-operator-controller-manager-5f4f55cb5c-l2vgk\" (UID: \"52ab49ac-37a7-4ba5-a2c3-9113b6821a5d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494701 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26879\" (UniqueName: \"kubernetes.io/projected/f87ad580-b279-47e4-8fdd-462285c7bead-kube-api-access-26879\") pod \"keystone-operator-controller-manager-684f77d66d-lj4bv\" (UID: \"f87ad580-b279-47e4-8fdd-462285c7bead\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494718 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpzl2\" (UniqueName: \"kubernetes.io/projected/8c1d644e-a547-48c7-bda5-95cdb6c0220f-kube-api-access-wpzl2\") pod \"mariadb-operator-controller-manager-658d4cdd5-rdph5\" (UID: \"8c1d644e-a547-48c7-bda5-95cdb6c0220f\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494738 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l99h\" (UniqueName: 
\"kubernetes.io/projected/810beff6-dacb-486e-be5b-fc4ad06e12d3-kube-api-access-4l99h\") pod \"ovn-operator-controller-manager-bbc5b68f9-4skd6\" (UID: \"810beff6-dacb-486e-be5b-fc4ad06e12d3\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsqk4\" (UniqueName: \"kubernetes.io/projected/1073de3d-8bac-4236-a9ce-c78d7bb2865b-kube-api-access-rsqk4\") pod \"neutron-operator-controller-manager-776c5696bf-stg56\" (UID: \"1073de3d-8bac-4236-a9ce-c78d7bb2865b\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494817 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82hhw\" (UniqueName: \"kubernetes.io/projected/d5b27da0-841c-49b1-b761-a9f61a402f6c-kube-api-access-82hhw\") pod \"placement-operator-controller-manager-574d45c66c-c284s\" (UID: \"d5b27da0-841c-49b1-b761-a9f61a402f6c\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494873 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.494928 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttm5b\" (UniqueName: \"kubernetes.io/projected/f53bfcff-fb8e-46d3-8818-39147c6ac29b-kube-api-access-ttm5b\") pod 
\"swift-operator-controller-manager-677c674df7-8qxfc\" (UID: \"f53bfcff-fb8e-46d3-8818-39147c6ac29b\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.495520 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.495595 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert podName:3aeac64d-7cf0-407c-a460-423a0082a8e9 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:35.995575382 +0000 UTC m=+1183.275228829 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" (UID: "3aeac64d-7cf0-407c-a460-423a0082a8e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.509719 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.537463 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.538392 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.541515 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5nh4s" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.547641 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l99h\" (UniqueName: \"kubernetes.io/projected/810beff6-dacb-486e-be5b-fc4ad06e12d3-kube-api-access-4l99h\") pod \"ovn-operator-controller-manager-bbc5b68f9-4skd6\" (UID: \"810beff6-dacb-486e-be5b-fc4ad06e12d3\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.547823 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsqk4\" (UniqueName: \"kubernetes.io/projected/1073de3d-8bac-4236-a9ce-c78d7bb2865b-kube-api-access-rsqk4\") pod \"neutron-operator-controller-manager-776c5696bf-stg56\" (UID: \"1073de3d-8bac-4236-a9ce-c78d7bb2865b\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.549151 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2lxv\" (UniqueName: \"kubernetes.io/projected/3aeac64d-7cf0-407c-a460-423a0082a8e9-kube-api-access-r2lxv\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.549434 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzg8m\" (UniqueName: \"kubernetes.io/projected/ad8d13c6-f90b-4eb4-adce-1d20f690cc98-kube-api-access-rzg8m\") pod 
\"manila-operator-controller-manager-68f45f9d9f-6dcrg\" (UID: \"ad8d13c6-f90b-4eb4-adce-1d20f690cc98\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.557269 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.560831 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26879\" (UniqueName: \"kubernetes.io/projected/f87ad580-b279-47e4-8fdd-462285c7bead-kube-api-access-26879\") pod \"keystone-operator-controller-manager-684f77d66d-lj4bv\" (UID: \"f87ad580-b279-47e4-8fdd-462285c7bead\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.561136 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nflcl\" (UniqueName: \"kubernetes.io/projected/8902cfaa-8c11-4e52-9f6d-d579e6cd50f5-kube-api-access-nflcl\") pod \"nova-operator-controller-manager-569cc54c5-fnchb\" (UID: \"8902cfaa-8c11-4e52-9f6d-d579e6cd50f5\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.567356 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpzl2\" (UniqueName: \"kubernetes.io/projected/8c1d644e-a547-48c7-bda5-95cdb6c0220f-kube-api-access-wpzl2\") pod \"mariadb-operator-controller-manager-658d4cdd5-rdph5\" (UID: \"8c1d644e-a547-48c7-bda5-95cdb6c0220f\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.568996 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.569441 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd7xq\" (UniqueName: \"kubernetes.io/projected/52ab49ac-37a7-4ba5-a2c3-9113b6821a5d-kube-api-access-jd7xq\") pod \"octavia-operator-controller-manager-5f4f55cb5c-l2vgk\" (UID: \"52ab49ac-37a7-4ba5-a2c3-9113b6821a5d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.590383 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.591490 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.596556 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qhmbn" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.597287 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brg6t\" (UniqueName: \"kubernetes.io/projected/9c83fbda-99d5-4661-9ca4-24008f71bb98-kube-api-access-brg6t\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-gw46v\" (UID: \"9c83fbda-99d5-4661-9ca4-24008f71bb98\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.597356 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82hhw\" (UniqueName: \"kubernetes.io/projected/d5b27da0-841c-49b1-b761-a9f61a402f6c-kube-api-access-82hhw\") pod \"placement-operator-controller-manager-574d45c66c-c284s\" (UID: 
\"d5b27da0-841c-49b1-b761-a9f61a402f6c\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.597378 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78xn\" (UniqueName: \"kubernetes.io/projected/d742678a-b8a2-409a-932d-3b7002db7636-kube-api-access-m78xn\") pod \"test-operator-controller-manager-5c5cb9c4d7-pl6k5\" (UID: \"d742678a-b8a2-409a-932d-3b7002db7636\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.597426 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttm5b\" (UniqueName: \"kubernetes.io/projected/f53bfcff-fb8e-46d3-8818-39147c6ac29b-kube-api-access-ttm5b\") pod \"swift-operator-controller-manager-677c674df7-8qxfc\" (UID: \"f53bfcff-fb8e-46d3-8818-39147c6ac29b\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.619803 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82hhw\" (UniqueName: \"kubernetes.io/projected/d5b27da0-841c-49b1-b761-a9f61a402f6c-kube-api-access-82hhw\") pod \"placement-operator-controller-manager-574d45c66c-c284s\" (UID: \"d5b27da0-841c-49b1-b761-a9f61a402f6c\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.625283 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.629072 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttm5b\" (UniqueName: \"kubernetes.io/projected/f53bfcff-fb8e-46d3-8818-39147c6ac29b-kube-api-access-ttm5b\") pod 
\"swift-operator-controller-manager-677c674df7-8qxfc\" (UID: \"f53bfcff-fb8e-46d3-8818-39147c6ac29b\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.635909 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.636979 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.640564 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.640994 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xzb89" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.649999 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.672266 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.679724 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.680506 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.684095 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.684134 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ftp9m" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.684336 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.696099 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.698599 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.698648 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brg6t\" (UniqueName: \"kubernetes.io/projected/9c83fbda-99d5-4661-9ca4-24008f71bb98-kube-api-access-brg6t\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-gw46v\" (UID: \"9c83fbda-99d5-4661-9ca4-24008f71bb98\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.698697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgggm\" (UniqueName: 
\"kubernetes.io/projected/9b7d27b4-b437-4bfb-b888-97b406ceb185-kube-api-access-xgggm\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.698730 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.698748 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m78xn\" (UniqueName: \"kubernetes.io/projected/d742678a-b8a2-409a-932d-3b7002db7636-kube-api-access-m78xn\") pod \"test-operator-controller-manager-5c5cb9c4d7-pl6k5\" (UID: \"d742678a-b8a2-409a-932d-3b7002db7636\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.698790 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq2m2\" (UniqueName: \"kubernetes.io/projected/c18db346-860e-487c-b232-6f404fdb1b7c-kube-api-access-hq2m2\") pod \"watcher-operator-controller-manager-6dd88c6f67-4n8g4\" (UID: \"c18db346-860e-487c-b232-6f404fdb1b7c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.699158 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.720321 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brg6t\" (UniqueName: \"kubernetes.io/projected/9c83fbda-99d5-4661-9ca4-24008f71bb98-kube-api-access-brg6t\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-gw46v\" (UID: \"9c83fbda-99d5-4661-9ca4-24008f71bb98\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.728347 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78xn\" (UniqueName: \"kubernetes.io/projected/d742678a-b8a2-409a-932d-3b7002db7636-kube-api-access-m78xn\") pod \"test-operator-controller-manager-5c5cb9c4d7-pl6k5\" (UID: \"d742678a-b8a2-409a-932d-3b7002db7636\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.728675 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.742063 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.766407 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.771617 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.773000 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.777517 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jqxtg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.780796 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.787336 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.799539 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgggm\" (UniqueName: \"kubernetes.io/projected/9b7d27b4-b437-4bfb-b888-97b406ceb185-kube-api-access-xgggm\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.800501 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.800588 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq2m2\" (UniqueName: \"kubernetes.io/projected/c18db346-860e-487c-b232-6f404fdb1b7c-kube-api-access-hq2m2\") pod \"watcher-operator-controller-manager-6dd88c6f67-4n8g4\" (UID: 
\"c18db346-860e-487c-b232-6f404fdb1b7c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.800648 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.800771 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.800822 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:36.30080354 +0000 UTC m=+1183.580456987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "webhook-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.801042 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.801110 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:36.301093488 +0000 UTC m=+1183.580746925 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "metrics-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.816514 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgggm\" (UniqueName: \"kubernetes.io/projected/9b7d27b4-b437-4bfb-b888-97b406ceb185-kube-api-access-xgggm\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.828529 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq2m2\" (UniqueName: \"kubernetes.io/projected/c18db346-860e-487c-b232-6f404fdb1b7c-kube-api-access-hq2m2\") pod \"watcher-operator-controller-manager-6dd88c6f67-4n8g4\" (UID: \"c18db346-860e-487c-b232-6f404fdb1b7c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.828925 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.829899 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.841747 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.871477 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.872219 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.901082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzv2w\" (UniqueName: \"kubernetes.io/projected/c087892e-22b2-4552-a57f-e1c1d75b7917-kube-api-access-rzv2w\") pod \"rabbitmq-cluster-operator-manager-668c99d594-22b2x\" (UID: \"c087892e-22b2-4552-a57f-e1c1d75b7917\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.902626 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.902773 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: E0313 12:06:35.902819 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert podName:784ee575-162b-4732-b82c-8f4b3c1e5317 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:36.902803633 +0000 UTC m=+1184.182457080 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert") pod "infra-operator-controller-manager-5995f4446f-thr75" (UID: "784ee575-162b-4732-b82c-8f4b3c1e5317") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.919041 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.924627 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5"] Mar 13 12:06:35 crc kubenswrapper[4786]: I0313 12:06:35.974247 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.003036 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.003082 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzv2w\" (UniqueName: \"kubernetes.io/projected/c087892e-22b2-4552-a57f-e1c1d75b7917-kube-api-access-rzv2w\") pod \"rabbitmq-cluster-operator-manager-668c99d594-22b2x\" (UID: \"c087892e-22b2-4552-a57f-e1c1d75b7917\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.003548 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.003601 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert podName:3aeac64d-7cf0-407c-a460-423a0082a8e9 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:37.003583944 +0000 UTC m=+1184.283237391 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" (UID: "3aeac64d-7cf0-407c-a460-423a0082a8e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.037381 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.038606 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzv2w\" (UniqueName: \"kubernetes.io/projected/c087892e-22b2-4552-a57f-e1c1d75b7917-kube-api-access-rzv2w\") pod \"rabbitmq-cluster-operator-manager-668c99d594-22b2x\" (UID: \"c087892e-22b2-4552-a57f-e1c1d75b7917\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.090936 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.306396 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.306788 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.306558 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.306985 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.307057 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:37.307038015 +0000 UTC m=+1184.586691472 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "webhook-server-cert" not found Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.307171 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:37.307140747 +0000 UTC m=+1184.586794294 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "metrics-server-cert" not found Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.378653 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.398735 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg"] Mar 13 12:06:36 crc kubenswrapper[4786]: W0313 12:06:36.402017 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f6df05_f37f_4863_b967_7b27429282f2.slice/crio-277f11eee13204bf52b1c61d2e91a36bc5242cfe4ffe119cf3bb8def73de0e24 WatchSource:0}: Error finding container 277f11eee13204bf52b1c61d2e91a36bc5242cfe4ffe119cf3bb8def73de0e24: Status 404 returned error can't find the container with id 277f11eee13204bf52b1c61d2e91a36bc5242cfe4ffe119cf3bb8def73de0e24 Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.417828 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" event={"ID":"28f6ab30-9436-45e7-a94f-b9757e0dc331","Type":"ContainerStarted","Data":"40bb5e265aa20588999696c628df1bbeb34fa4080860bb50cc3d8abf6c5cf6e5"} Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.419595 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" event={"ID":"8ba2beff-196b-4a24-a490-86a81b9f7495","Type":"ContainerStarted","Data":"736fb600e98851c6c7f5aa5a30327e19d74ab092d6c72c8572a1c39ffda1e57e"} Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.422160 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" event={"ID":"47c63b16-0044-4bef-848e-084b958e853b","Type":"ContainerStarted","Data":"e951b5fbc2fdd42f8e03aa4a57ef1af4d70b7775f888dcde171bb106420e59e7"} Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.423551 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" event={"ID":"4eb75275-4f14-406c-950a-fa40061041af","Type":"ContainerStarted","Data":"6424f05b3ac2accc7274a3b3114c8af3c9ed5ac8eb0c991383adbb6746761093"} Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.509199 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.518759 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.532480 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.759339 4786 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5"] Mar 13 12:06:36 crc kubenswrapper[4786]: W0313 12:06:36.789374 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c1d644e_a547_48c7_bda5_95cdb6c0220f.slice/crio-9eff2725e2368deadae9e46498b3536e251e986bb0de66ad7c16bf41431c4808 WatchSource:0}: Error finding container 9eff2725e2368deadae9e46498b3536e251e986bb0de66ad7c16bf41431c4808: Status 404 returned error can't find the container with id 9eff2725e2368deadae9e46498b3536e251e986bb0de66ad7c16bf41431c4808 Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.794744 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.815669 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.826812 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.842138 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v"] Mar 13 12:06:36 crc kubenswrapper[4786]: W0313 12:06:36.847354 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c83fbda_99d5_4661_9ca4_24008f71bb98.slice/crio-915b90ebb1f8da45591d5880a721e11e6f35c4598f63735d656af2310d28b1b2 WatchSource:0}: Error finding container 915b90ebb1f8da45591d5880a721e11e6f35c4598f63735d656af2310d28b1b2: Status 404 returned error can't find the container with id 915b90ebb1f8da45591d5880a721e11e6f35c4598f63735d656af2310d28b1b2 Mar 13 12:06:36 crc 
kubenswrapper[4786]: I0313 12:06:36.848026 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4"] Mar 13 12:06:36 crc kubenswrapper[4786]: W0313 12:06:36.853686 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18db346_860e_487c_b232_6f404fdb1b7c.slice/crio-39f4479dc1a823408941dc7c71814a11fb7b5ec742c0fcc53a09e447ebc5c681 WatchSource:0}: Error finding container 39f4479dc1a823408941dc7c71814a11fb7b5ec742c0fcc53a09e447ebc5c681: Status 404 returned error can't find the container with id 39f4479dc1a823408941dc7c71814a11fb7b5ec742c0fcc53a09e447ebc5c681 Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.856069 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr"] Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.861060 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m78xn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-pl6k5_openstack-operators(d742678a-b8a2-409a-932d-3b7002db7636): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.861073 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rzv2w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-22b2x_openstack-operators(c087892e-22b2-4552-a57f-e1c1d75b7917): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.861114 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk"] Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.861166 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hq2m2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-4n8g4_openstack-operators(c18db346-860e-487c-b232-6f404fdb1b7c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.861402 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82hhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-c284s_openstack-operators(d5b27da0-841c-49b1-b761-a9f61a402f6c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.862145 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" podUID="d742678a-b8a2-409a-932d-3b7002db7636" Mar 13 12:06:36 crc 
kubenswrapper[4786]: E0313 12:06:36.862207 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" podUID="c087892e-22b2-4552-a57f-e1c1d75b7917" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.862307 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" podUID="c18db346-860e-487c-b232-6f404fdb1b7c" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.862463 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" podUID="d5b27da0-841c-49b1-b761-a9f61a402f6c" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.863263 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nfmjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6d9d6b584d-9w5pr_openstack-operators(77819c69-e0b5-4eb8-a124-fb1339701ccb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.865516 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" podUID="77819c69-e0b5-4eb8-a124-fb1339701ccb" Mar 13 12:06:36 crc 
kubenswrapper[4786]: I0313 12:06:36.888444 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.895491 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c284s"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.898646 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.902845 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x"] Mar 13 12:06:36 crc kubenswrapper[4786]: I0313 12:06:36.915986 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.916188 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:36 crc kubenswrapper[4786]: E0313 12:06:36.916249 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert podName:784ee575-162b-4732-b82c-8f4b3c1e5317 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:38.916231897 +0000 UTC m=+1186.195885354 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert") pod "infra-operator-controller-manager-5995f4446f-thr75" (UID: "784ee575-162b-4732-b82c-8f4b3c1e5317") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.018266 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.018423 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.018773 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert podName:3aeac64d-7cf0-407c-a460-423a0082a8e9 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:39.018750504 +0000 UTC m=+1186.298403951 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" (UID: "3aeac64d-7cf0-407c-a460-423a0082a8e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.323549 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.323663 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.323793 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.323811 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.323873 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:39.323850489 +0000 UTC m=+1186.603503936 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "metrics-server-cert" not found Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.323941 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:39.323931921 +0000 UTC m=+1186.603585368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "webhook-server-cert" not found Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.431279 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" event={"ID":"f87ad580-b279-47e4-8fdd-462285c7bead","Type":"ContainerStarted","Data":"0ca81d4c8db02de3d6ecb226f8e7c87cb2066536693fffb31a4b6a4678b8a951"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.432539 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" event={"ID":"1073de3d-8bac-4236-a9ce-c78d7bb2865b","Type":"ContainerStarted","Data":"f24f4c9156c64b37b6e4bddff9c58971035d6ea0020b337b6392e7dc0e290b5e"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.435297 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" 
event={"ID":"17f6df05-f37f-4863-b967-7b27429282f2","Type":"ContainerStarted","Data":"277f11eee13204bf52b1c61d2e91a36bc5242cfe4ffe119cf3bb8def73de0e24"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.436439 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" event={"ID":"8902cfaa-8c11-4e52-9f6d-d579e6cd50f5","Type":"ContainerStarted","Data":"683d260931a2a9d19e2a0e001f4f7c707945fc99df6e6ced0b6286663ca04802"} Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.445241 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" podUID="c087892e-22b2-4552-a57f-e1c1d75b7917" Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.450741 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" podUID="d5b27da0-841c-49b1-b761-a9f61a402f6c" Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.455967 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" podUID="77819c69-e0b5-4eb8-a124-fb1339701ccb" Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.471010 4786 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" podUID="d742678a-b8a2-409a-932d-3b7002db7636" Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.477042 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" event={"ID":"810beff6-dacb-486e-be5b-fc4ad06e12d3","Type":"ContainerStarted","Data":"f8aaca0343d752b480be23a84de751a3e5a7c58c72d0c9574d42f84b03791abe"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.477100 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" event={"ID":"c087892e-22b2-4552-a57f-e1c1d75b7917","Type":"ContainerStarted","Data":"fc81bbe5311f0290d60ab419d326219258f32b2cc188d04ff0f9e6f22dc7597a"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.477112 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" event={"ID":"ad8d13c6-f90b-4eb4-adce-1d20f690cc98","Type":"ContainerStarted","Data":"35cddfecc808c03067896e5ec8e87089a41d57dc64da83ebfc1c69180932bf8e"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.477123 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" event={"ID":"d5b27da0-841c-49b1-b761-a9f61a402f6c","Type":"ContainerStarted","Data":"32d358764ffa70f44705e4474aeb17bf11e3b46a754b3d85aaab9bc6b70702f4"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.477132 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" 
event={"ID":"77819c69-e0b5-4eb8-a124-fb1339701ccb","Type":"ContainerStarted","Data":"ff2ed13ddacbcf68f61224bd308e811cdd118e800c2c364f42feda129ace1afc"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.477148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" event={"ID":"9c83fbda-99d5-4661-9ca4-24008f71bb98","Type":"ContainerStarted","Data":"915b90ebb1f8da45591d5880a721e11e6f35c4598f63735d656af2310d28b1b2"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.477157 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" event={"ID":"d742678a-b8a2-409a-932d-3b7002db7636","Type":"ContainerStarted","Data":"17a77adb1613d831fe425c2d38f019acd04ea6d0009fa29c018b3d6b42365c19"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.477168 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" event={"ID":"e4075ad0-d00e-4675-97e6-87e1d7e845d9","Type":"ContainerStarted","Data":"093f6133661752bcaf2451bce78dc47837f62376ad881e0a37da496e464fb917"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.480581 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" event={"ID":"8c1d644e-a547-48c7-bda5-95cdb6c0220f","Type":"ContainerStarted","Data":"9eff2725e2368deadae9e46498b3536e251e986bb0de66ad7c16bf41431c4808"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.484183 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" event={"ID":"c18db346-860e-487c-b232-6f404fdb1b7c","Type":"ContainerStarted","Data":"39f4479dc1a823408941dc7c71814a11fb7b5ec742c0fcc53a09e447ebc5c681"} Mar 13 12:06:37 crc kubenswrapper[4786]: E0313 12:06:37.490686 4786 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" podUID="c18db346-860e-487c-b232-6f404fdb1b7c" Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.493000 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" event={"ID":"52ab49ac-37a7-4ba5-a2c3-9113b6821a5d","Type":"ContainerStarted","Data":"1844feae2876fbb824c4a0d5b560527243e2def5e37ac30698c704d2cf3fecc4"} Mar 13 12:06:37 crc kubenswrapper[4786]: I0313 12:06:37.495526 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" event={"ID":"f53bfcff-fb8e-46d3-8818-39147c6ac29b","Type":"ContainerStarted","Data":"02f50ca6a36e4e35bca2d99143d07384a280072a2f08f6f058f0229925d6d5b5"} Mar 13 12:06:38 crc kubenswrapper[4786]: E0313 12:06:38.515552 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" podUID="c087892e-22b2-4552-a57f-e1c1d75b7917" Mar 13 12:06:38 crc kubenswrapper[4786]: E0313 12:06:38.515599 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" 
podUID="d742678a-b8a2-409a-932d-3b7002db7636" Mar 13 12:06:38 crc kubenswrapper[4786]: E0313 12:06:38.515610 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" podUID="c18db346-860e-487c-b232-6f404fdb1b7c" Mar 13 12:06:38 crc kubenswrapper[4786]: E0313 12:06:38.516075 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" podUID="77819c69-e0b5-4eb8-a124-fb1339701ccb" Mar 13 12:06:38 crc kubenswrapper[4786]: E0313 12:06:38.516148 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" podUID="d5b27da0-841c-49b1-b761-a9f61a402f6c" Mar 13 12:06:38 crc kubenswrapper[4786]: I0313 12:06:38.961580 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:38 crc kubenswrapper[4786]: E0313 12:06:38.961754 4786 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:38 crc kubenswrapper[4786]: E0313 12:06:38.961816 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert podName:784ee575-162b-4732-b82c-8f4b3c1e5317 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:42.961800508 +0000 UTC m=+1190.241453945 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert") pod "infra-operator-controller-manager-5995f4446f-thr75" (UID: "784ee575-162b-4732-b82c-8f4b3c1e5317") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:39 crc kubenswrapper[4786]: I0313 12:06:39.062859 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.063040 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.063134 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert podName:3aeac64d-7cf0-407c-a460-423a0082a8e9 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:43.063110202 +0000 UTC m=+1190.342763639 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" (UID: "3aeac64d-7cf0-407c-a460-423a0082a8e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:39 crc kubenswrapper[4786]: I0313 12:06:39.367454 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:39 crc kubenswrapper[4786]: I0313 12:06:39.367598 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.367724 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.367767 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.367845 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:43.367813527 +0000 UTC m=+1190.647467014 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "metrics-server-cert" not found Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.367876 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:43.367863088 +0000 UTC m=+1190.647516575 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "webhook-server-cert" not found Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.519039 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" podUID="c087892e-22b2-4552-a57f-e1c1d75b7917" Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.520372 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" podUID="d742678a-b8a2-409a-932d-3b7002db7636" Mar 13 12:06:39 crc kubenswrapper[4786]: E0313 12:06:39.520822 4786 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" podUID="d5b27da0-841c-49b1-b761-a9f61a402f6c" Mar 13 12:06:40 crc kubenswrapper[4786]: I0313 12:06:40.180083 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-vv77w" podUID="ae33a694-0398-4129-9926-1b6dcb6ecc40" containerName="registry-server" probeResult="failure" output=< Mar 13 12:06:40 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 12:06:40 crc kubenswrapper[4786]: > Mar 13 12:06:40 crc kubenswrapper[4786]: I0313 12:06:40.183617 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-vv77w" podUID="ae33a694-0398-4129-9926-1b6dcb6ecc40" containerName="registry-server" probeResult="failure" output=< Mar 13 12:06:40 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 12:06:40 crc kubenswrapper[4786]: > Mar 13 12:06:43 crc kubenswrapper[4786]: I0313 12:06:43.005978 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:43 crc kubenswrapper[4786]: E0313 12:06:43.006157 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:43 crc kubenswrapper[4786]: E0313 12:06:43.006511 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert podName:784ee575-162b-4732-b82c-8f4b3c1e5317 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:51.00648281 +0000 UTC m=+1198.286136267 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert") pod "infra-operator-controller-manager-5995f4446f-thr75" (UID: "784ee575-162b-4732-b82c-8f4b3c1e5317") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:43 crc kubenswrapper[4786]: I0313 12:06:43.107508 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:43 crc kubenswrapper[4786]: E0313 12:06:43.107685 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:43 crc kubenswrapper[4786]: E0313 12:06:43.107754 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert podName:3aeac64d-7cf0-407c-a460-423a0082a8e9 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:51.107734383 +0000 UTC m=+1198.387387830 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" (UID: "3aeac64d-7cf0-407c-a460-423a0082a8e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:06:43 crc kubenswrapper[4786]: I0313 12:06:43.412083 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:43 crc kubenswrapper[4786]: I0313 12:06:43.412220 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:43 crc kubenswrapper[4786]: E0313 12:06:43.412262 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:06:43 crc kubenswrapper[4786]: E0313 12:06:43.412350 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:51.412332534 +0000 UTC m=+1198.691985981 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "metrics-server-cert" not found Mar 13 12:06:43 crc kubenswrapper[4786]: E0313 12:06:43.412386 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:06:43 crc kubenswrapper[4786]: E0313 12:06:43.412474 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:06:51.412451037 +0000 UTC m=+1198.692104564 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "webhook-server-cert" not found Mar 13 12:06:48 crc kubenswrapper[4786]: E0313 12:06:48.527304 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 13 12:06:48 crc kubenswrapper[4786]: E0313 12:06:48.528123 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rsqk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-stg56_openstack-operators(1073de3d-8bac-4236-a9ce-c78d7bb2865b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:48 crc kubenswrapper[4786]: E0313 12:06:48.529320 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" podUID="1073de3d-8bac-4236-a9ce-c78d7bb2865b" Mar 13 12:06:48 crc kubenswrapper[4786]: E0313 12:06:48.588280 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" podUID="1073de3d-8bac-4236-a9ce-c78d7bb2865b" Mar 13 12:06:51 crc kubenswrapper[4786]: I0313 12:06:51.027937 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:06:51 crc kubenswrapper[4786]: E0313 12:06:51.028100 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:51 crc kubenswrapper[4786]: E0313 12:06:51.028412 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert podName:784ee575-162b-4732-b82c-8f4b3c1e5317 nodeName:}" failed. No retries permitted until 2026-03-13 12:07:07.02838489 +0000 UTC m=+1214.308038397 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert") pod "infra-operator-controller-manager-5995f4446f-thr75" (UID: "784ee575-162b-4732-b82c-8f4b3c1e5317") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:06:51 crc kubenswrapper[4786]: I0313 12:06:51.129856 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:51 crc kubenswrapper[4786]: I0313 12:06:51.284000 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3aeac64d-7cf0-407c-a460-423a0082a8e9-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b\" (UID: \"3aeac64d-7cf0-407c-a460-423a0082a8e9\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:51 crc kubenswrapper[4786]: I0313 12:06:51.288704 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:06:51 crc kubenswrapper[4786]: I0313 12:06:51.433236 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:51 crc kubenswrapper[4786]: I0313 12:06:51.433322 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:51 crc kubenswrapper[4786]: E0313 12:06:51.433406 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:06:51 crc kubenswrapper[4786]: E0313 12:06:51.433452 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs podName:9b7d27b4-b437-4bfb-b888-97b406ceb185 nodeName:}" failed. No retries permitted until 2026-03-13 12:07:07.433437643 +0000 UTC m=+1214.713091090 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-5x2pg" (UID: "9b7d27b4-b437-4bfb-b888-97b406ceb185") : secret "metrics-server-cert" not found Mar 13 12:06:51 crc kubenswrapper[4786]: I0313 12:06:51.446169 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:06:55 crc kubenswrapper[4786]: E0313 12:06:55.485046 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571" Mar 13 12:06:55 crc kubenswrapper[4786]: E0313 12:06:55.486176 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jd7xq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-l2vgk_openstack-operators(52ab49ac-37a7-4ba5-a2c3-9113b6821a5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:55 crc kubenswrapper[4786]: E0313 12:06:55.488124 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" podUID="52ab49ac-37a7-4ba5-a2c3-9113b6821a5d" Mar 13 12:06:55 crc kubenswrapper[4786]: E0313 12:06:55.646927 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" podUID="52ab49ac-37a7-4ba5-a2c3-9113b6821a5d" Mar 13 12:06:56 crc kubenswrapper[4786]: E0313 12:06:56.057059 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2" Mar 13 12:06:56 crc kubenswrapper[4786]: E0313 12:06:56.057241 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpzl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-658d4cdd5-rdph5_openstack-operators(8c1d644e-a547-48c7-bda5-95cdb6c0220f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:56 crc kubenswrapper[4786]: E0313 12:06:56.058984 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" podUID="8c1d644e-a547-48c7-bda5-95cdb6c0220f" Mar 13 12:06:56 crc kubenswrapper[4786]: E0313 12:06:56.656682 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" podUID="8c1d644e-a547-48c7-bda5-95cdb6c0220f" Mar 13 12:06:56 crc kubenswrapper[4786]: E0313 12:06:56.846974 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4" Mar 13 12:06:56 crc kubenswrapper[4786]: E0313 12:06:56.847147 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rzg8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-68f45f9d9f-6dcrg_openstack-operators(ad8d13c6-f90b-4eb4-adce-1d20f690cc98): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:56 crc kubenswrapper[4786]: E0313 12:06:56.848524 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" podUID="ad8d13c6-f90b-4eb4-adce-1d20f690cc98" Mar 13 12:06:57 crc kubenswrapper[4786]: E0313 12:06:57.532472 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 13 12:06:57 crc kubenswrapper[4786]: E0313 12:06:57.533105 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-26879,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-lj4bv_openstack-operators(f87ad580-b279-47e4-8fdd-462285c7bead): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:57 crc kubenswrapper[4786]: E0313 12:06:57.534332 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" podUID="f87ad580-b279-47e4-8fdd-462285c7bead" Mar 13 12:06:57 crc kubenswrapper[4786]: E0313 12:06:57.662453 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" podUID="f87ad580-b279-47e4-8fdd-462285c7bead" Mar 13 12:06:57 crc kubenswrapper[4786]: E0313 12:06:57.662755 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" podUID="ad8d13c6-f90b-4eb4-adce-1d20f690cc98" Mar 13 12:07:00 crc kubenswrapper[4786]: I0313 12:07:00.935624 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b"] Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.702442 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" event={"ID":"3aeac64d-7cf0-407c-a460-423a0082a8e9","Type":"ContainerStarted","Data":"4752fbfd9e0cbe8fd5df863f48ef8d856425b822a216956a759d6bc8180a1ad2"} Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.706596 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" event={"ID":"17f6df05-f37f-4863-b967-7b27429282f2","Type":"ContainerStarted","Data":"3075902d3f6ead49ebd7a6c424dce04b81a2097300e94b26d614b2525b2ab8ab"} Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.706708 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.711236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" event={"ID":"47c63b16-0044-4bef-848e-084b958e853b","Type":"ContainerStarted","Data":"3850876cc7026048aa1b012de59918099554c4b0e8067b28c402048e9ac9430d"} Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.711480 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.713900 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" event={"ID":"4eb75275-4f14-406c-950a-fa40061041af","Type":"ContainerStarted","Data":"8788f05c4d7277950f6157e9a97fe5bb3b49a7ee7950c54ad2afcce3ce1bf726"} Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.714449 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.716342 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.745384 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" podStartSLOduration=6.032648337 podStartE2EDuration="27.745366398s" podCreationTimestamp="2026-03-13 12:06:34 +0000 UTC" firstStartedPulling="2026-03-13 12:06:35.828703291 +0000 UTC m=+1183.108356738" lastFinishedPulling="2026-03-13 12:06:57.541421362 +0000 UTC m=+1204.821074799" observedRunningTime="2026-03-13 12:07:01.743141688 +0000 UTC m=+1209.022795135" watchObservedRunningTime="2026-03-13 12:07:01.745366398 +0000 UTC m=+1209.025019845" Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.752609 4786 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" podStartSLOduration=6.614906394 podStartE2EDuration="27.752591632s" podCreationTimestamp="2026-03-13 12:06:34 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.404064754 +0000 UTC m=+1183.683718201" lastFinishedPulling="2026-03-13 12:06:57.541749992 +0000 UTC m=+1204.821403439" observedRunningTime="2026-03-13 12:07:01.72573288 +0000 UTC m=+1209.005386327" watchObservedRunningTime="2026-03-13 12:07:01.752591632 +0000 UTC m=+1209.032245099" Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.762365 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" podStartSLOduration=6.641790777 podStartE2EDuration="27.762348245s" podCreationTimestamp="2026-03-13 12:06:34 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.388067573 +0000 UTC m=+1183.667721020" lastFinishedPulling="2026-03-13 12:06:57.508625041 +0000 UTC m=+1204.788278488" observedRunningTime="2026-03-13 12:07:01.759782565 +0000 UTC m=+1209.039436022" watchObservedRunningTime="2026-03-13 12:07:01.762348245 +0000 UTC m=+1209.042001702" Mar 13 12:07:01 crc kubenswrapper[4786]: I0313 12:07:01.786154 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" podStartSLOduration=6.235530202 podStartE2EDuration="27.786134604s" podCreationTimestamp="2026-03-13 12:06:34 +0000 UTC" firstStartedPulling="2026-03-13 12:06:35.958852981 +0000 UTC m=+1183.238506428" lastFinishedPulling="2026-03-13 12:06:57.509457383 +0000 UTC m=+1204.789110830" observedRunningTime="2026-03-13 12:07:01.78337039 +0000 UTC m=+1209.063023847" watchObservedRunningTime="2026-03-13 12:07:01.786134604 +0000 UTC m=+1209.065788071" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.723791 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" event={"ID":"f53bfcff-fb8e-46d3-8818-39147c6ac29b","Type":"ContainerStarted","Data":"50e92a5209e3978e0b133dad047ed48fdec8a9cef90fda59b8746eab6e064ca7"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.724874 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.727427 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" event={"ID":"c087892e-22b2-4552-a57f-e1c1d75b7917","Type":"ContainerStarted","Data":"31935c2272f099c59362fd7d5f8ad874b7123280a0826da5c75cfae18c66a2d1"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.729330 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" event={"ID":"c18db346-860e-487c-b232-6f404fdb1b7c","Type":"ContainerStarted","Data":"7351ceee54e34d4962a9a2b2bd31354c661826401c399043bae1d4522861549d"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.729709 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.730848 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" event={"ID":"77819c69-e0b5-4eb8-a124-fb1339701ccb","Type":"ContainerStarted","Data":"3a516c014f93f5bc59dd3c447c6e65ec92768f3e63f56810b28f9e683b74e0d4"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.731264 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.732276 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" event={"ID":"8902cfaa-8c11-4e52-9f6d-d579e6cd50f5","Type":"ContainerStarted","Data":"f86ae6f81660233717bb2a7a4b36633511e708d3281729a3f20e370d4ea8170d"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.732691 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.734206 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" event={"ID":"810beff6-dacb-486e-be5b-fc4ad06e12d3","Type":"ContainerStarted","Data":"5018ee5f375cd895d76837fd49783ad7c5b6ed1689e5490556ddf36aff6ec8eb"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.734383 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.736096 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" event={"ID":"e4075ad0-d00e-4675-97e6-87e1d7e845d9","Type":"ContainerStarted","Data":"490cab36f38ded3e96866e93321ffa3bc5432514776f77375e34134bf1b033c4"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.736291 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.737303 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" event={"ID":"8ba2beff-196b-4a24-a490-86a81b9f7495","Type":"ContainerStarted","Data":"df740cd3c96a6a3145949704e747bd914a186820f33a44d713a0d608e348cbea"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.737812 4786 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.744408 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" event={"ID":"28f6ab30-9436-45e7-a94f-b9757e0dc331","Type":"ContainerStarted","Data":"5d007d85be95eb60e1a431ed62e70e7a6c8cf688d9b2621865afd954dd6c48f9"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.745063 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" podStartSLOduration=7.031374477 podStartE2EDuration="27.745047822s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.828079967 +0000 UTC m=+1184.107733404" lastFinishedPulling="2026-03-13 12:06:57.541753302 +0000 UTC m=+1204.821406749" observedRunningTime="2026-03-13 12:07:02.741226029 +0000 UTC m=+1210.020879476" watchObservedRunningTime="2026-03-13 12:07:02.745047822 +0000 UTC m=+1210.024701269" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.749297 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" event={"ID":"d5b27da0-841c-49b1-b761-a9f61a402f6c","Type":"ContainerStarted","Data":"de9989df4cdd97179499d653fb3f3dab0138f311721480203039e80d0e87cc12"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.749548 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.753831 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" 
event={"ID":"d742678a-b8a2-409a-932d-3b7002db7636","Type":"ContainerStarted","Data":"7f87b19fd43ee0b79385659cee4261782f3ddd217d4a79831b2b7e84dae54114"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.754479 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.761661 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" event={"ID":"9c83fbda-99d5-4661-9ca4-24008f71bb98","Type":"ContainerStarted","Data":"3d60d1cb3cc37931a2e7140f3e6c2f05006e1626507657ccb40508dc1aba1320"} Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.761704 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.782975 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" podStartSLOduration=4.561243599 podStartE2EDuration="27.782960032s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.517756301 +0000 UTC m=+1183.797409748" lastFinishedPulling="2026-03-13 12:06:59.739472734 +0000 UTC m=+1207.019126181" observedRunningTime="2026-03-13 12:07:02.780936547 +0000 UTC m=+1210.060589994" watchObservedRunningTime="2026-03-13 12:07:02.782960032 +0000 UTC m=+1210.062613479" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.799503 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" podStartSLOduration=6.7347750699999995 podStartE2EDuration="27.799485536s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.477040796 +0000 UTC 
m=+1183.756694243" lastFinishedPulling="2026-03-13 12:06:57.541751262 +0000 UTC m=+1204.821404709" observedRunningTime="2026-03-13 12:07:02.794663696 +0000 UTC m=+1210.074317133" watchObservedRunningTime="2026-03-13 12:07:02.799485536 +0000 UTC m=+1210.079138983" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.854739 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" podStartSLOduration=3.425138756 podStartE2EDuration="27.854722241s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.859736298 +0000 UTC m=+1184.139389745" lastFinishedPulling="2026-03-13 12:07:01.289319793 +0000 UTC m=+1208.568973230" observedRunningTime="2026-03-13 12:07:02.854222318 +0000 UTC m=+1210.133875785" watchObservedRunningTime="2026-03-13 12:07:02.854722241 +0000 UTC m=+1210.134375688" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.892292 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" podStartSLOduration=5.007052517 podStartE2EDuration="27.892273321s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.853788278 +0000 UTC m=+1184.133441715" lastFinishedPulling="2026-03-13 12:06:59.739009032 +0000 UTC m=+1207.018662519" observedRunningTime="2026-03-13 12:07:02.812191268 +0000 UTC m=+1210.091844715" watchObservedRunningTime="2026-03-13 12:07:02.892273321 +0000 UTC m=+1210.171926788" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.928694 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-22b2x" podStartSLOduration=3.402003874 podStartE2EDuration="27.92867765s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.86092186 +0000 UTC m=+1184.140575307" 
lastFinishedPulling="2026-03-13 12:07:01.387595636 +0000 UTC m=+1208.667249083" observedRunningTime="2026-03-13 12:07:02.903584375 +0000 UTC m=+1210.183237832" watchObservedRunningTime="2026-03-13 12:07:02.92867765 +0000 UTC m=+1210.208331097" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.930603 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" podStartSLOduration=7.5092897579999995 podStartE2EDuration="28.930595102s" podCreationTimestamp="2026-03-13 12:06:34 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.087389458 +0000 UTC m=+1183.367042905" lastFinishedPulling="2026-03-13 12:06:57.508694802 +0000 UTC m=+1204.788348249" observedRunningTime="2026-03-13 12:07:02.9275491 +0000 UTC m=+1210.207202547" watchObservedRunningTime="2026-03-13 12:07:02.930595102 +0000 UTC m=+1210.210248549" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.955579 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" podStartSLOduration=4.528090614 podStartE2EDuration="28.955562833s" podCreationTimestamp="2026-03-13 12:06:34 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.86315184 +0000 UTC m=+1184.142805287" lastFinishedPulling="2026-03-13 12:07:01.290624059 +0000 UTC m=+1208.570277506" observedRunningTime="2026-03-13 12:07:02.951913666 +0000 UTC m=+1210.231567123" watchObservedRunningTime="2026-03-13 12:07:02.955562833 +0000 UTC m=+1210.235216270" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.978225 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" podStartSLOduration=3.5488382019999998 podStartE2EDuration="27.978207702s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.860966081 +0000 UTC m=+1184.140619528" 
lastFinishedPulling="2026-03-13 12:07:01.290335581 +0000 UTC m=+1208.569989028" observedRunningTime="2026-03-13 12:07:02.976940608 +0000 UTC m=+1210.256594065" watchObservedRunningTime="2026-03-13 12:07:02.978207702 +0000 UTC m=+1210.257861149" Mar 13 12:07:02 crc kubenswrapper[4786]: I0313 12:07:02.998583 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" podStartSLOduration=3.570502385 podStartE2EDuration="27.99856887s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.861278749 +0000 UTC m=+1184.140932196" lastFinishedPulling="2026-03-13 12:07:01.289345234 +0000 UTC m=+1208.568998681" observedRunningTime="2026-03-13 12:07:02.996481223 +0000 UTC m=+1210.276134680" watchObservedRunningTime="2026-03-13 12:07:02.99856887 +0000 UTC m=+1210.278222317" Mar 13 12:07:03 crc kubenswrapper[4786]: I0313 12:07:03.029269 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" podStartSLOduration=5.144094843 podStartE2EDuration="28.029251155s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.853761037 +0000 UTC m=+1184.133414484" lastFinishedPulling="2026-03-13 12:06:59.738917349 +0000 UTC m=+1207.018570796" observedRunningTime="2026-03-13 12:07:03.02832592 +0000 UTC m=+1210.307979357" watchObservedRunningTime="2026-03-13 12:07:03.029251155 +0000 UTC m=+1210.308904602" Mar 13 12:07:04 crc kubenswrapper[4786]: I0313 12:07:04.778972 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" event={"ID":"3aeac64d-7cf0-407c-a460-423a0082a8e9","Type":"ContainerStarted","Data":"88cdc24409f2ef8f1344531d41603c7af4e67d8cb58fe60f806e346405e6682e"} Mar 13 12:07:04 crc kubenswrapper[4786]: I0313 12:07:04.780292 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:07:04 crc kubenswrapper[4786]: I0313 12:07:04.782580 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" event={"ID":"1073de3d-8bac-4236-a9ce-c78d7bb2865b","Type":"ContainerStarted","Data":"95e626e3d07ac48c63a8191fa6eabfac63b6bef98f816d10fdecd97f4d2bfb1f"} Mar 13 12:07:04 crc kubenswrapper[4786]: I0313 12:07:04.783128 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" Mar 13 12:07:04 crc kubenswrapper[4786]: I0313 12:07:04.813744 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" podStartSLOduration=26.635544974 podStartE2EDuration="29.813727584s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:07:01.302087867 +0000 UTC m=+1208.581741314" lastFinishedPulling="2026-03-13 12:07:04.480270467 +0000 UTC m=+1211.759923924" observedRunningTime="2026-03-13 12:07:04.812564224 +0000 UTC m=+1212.092217681" watchObservedRunningTime="2026-03-13 12:07:04.813727584 +0000 UTC m=+1212.093381041" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.101216 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.113738 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/784ee575-162b-4732-b82c-8f4b3c1e5317-cert\") pod \"infra-operator-controller-manager-5995f4446f-thr75\" (UID: \"784ee575-162b-4732-b82c-8f4b3c1e5317\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.292968 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-d9kpz" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.302179 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.506182 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.511762 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b7d27b4-b437-4bfb-b888-97b406ceb185-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-5x2pg\" (UID: \"9b7d27b4-b437-4bfb-b888-97b406ceb185\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.546241 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" podStartSLOduration=4.560852688 podStartE2EDuration="32.546221239s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.492978584 +0000 UTC m=+1183.772632021" 
lastFinishedPulling="2026-03-13 12:07:04.478347115 +0000 UTC m=+1211.758000572" observedRunningTime="2026-03-13 12:07:04.832375656 +0000 UTC m=+1212.112029223" watchObservedRunningTime="2026-03-13 12:07:07.546221239 +0000 UTC m=+1214.825874686" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.547476 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-thr75"] Mar 13 12:07:07 crc kubenswrapper[4786]: W0313 12:07:07.551236 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784ee575_162b_4732_b82c_8f4b3c1e5317.slice/crio-27608d0986a2a275d95b6765014a55b483050ffecda9294aa668d99e89915e97 WatchSource:0}: Error finding container 27608d0986a2a275d95b6765014a55b483050ffecda9294aa668d99e89915e97: Status 404 returned error can't find the container with id 27608d0986a2a275d95b6765014a55b483050ffecda9294aa668d99e89915e97 Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.805649 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" event={"ID":"784ee575-162b-4732-b82c-8f4b3c1e5317","Type":"ContainerStarted","Data":"27608d0986a2a275d95b6765014a55b483050ffecda9294aa668d99e89915e97"} Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.806585 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ftp9m" Mar 13 12:07:07 crc kubenswrapper[4786]: I0313 12:07:07.814149 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:07:08 crc kubenswrapper[4786]: I0313 12:07:08.088631 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg"] Mar 13 12:07:08 crc kubenswrapper[4786]: W0313 12:07:08.100629 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b7d27b4_b437_4bfb_b888_97b406ceb185.slice/crio-cc4c4f3f86052dc3406d2e4a0cf326652ae57282f9856cc9c2fee495302d1929 WatchSource:0}: Error finding container cc4c4f3f86052dc3406d2e4a0cf326652ae57282f9856cc9c2fee495302d1929: Status 404 returned error can't find the container with id cc4c4f3f86052dc3406d2e4a0cf326652ae57282f9856cc9c2fee495302d1929 Mar 13 12:07:08 crc kubenswrapper[4786]: I0313 12:07:08.170472 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:07:08 crc kubenswrapper[4786]: I0313 12:07:08.170538 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:07:08 crc kubenswrapper[4786]: I0313 12:07:08.824947 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" event={"ID":"9b7d27b4-b437-4bfb-b888-97b406ceb185","Type":"ContainerStarted","Data":"c6082986055eeddeaa31718578ef8f27d0d90bd84babcbe414bc482dbf7058ed"} Mar 13 12:07:08 crc kubenswrapper[4786]: I0313 
12:07:08.825003 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" event={"ID":"9b7d27b4-b437-4bfb-b888-97b406ceb185","Type":"ContainerStarted","Data":"cc4c4f3f86052dc3406d2e4a0cf326652ae57282f9856cc9c2fee495302d1929"} Mar 13 12:07:08 crc kubenswrapper[4786]: I0313 12:07:08.825113 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:07:08 crc kubenswrapper[4786]: I0313 12:07:08.860154 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" podStartSLOduration=33.860133613 podStartE2EDuration="33.860133613s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:08.856186577 +0000 UTC m=+1216.135840044" watchObservedRunningTime="2026-03-13 12:07:08.860133613 +0000 UTC m=+1216.139787060" Mar 13 12:07:09 crc kubenswrapper[4786]: I0313 12:07:09.835745 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" event={"ID":"52ab49ac-37a7-4ba5-a2c3-9113b6821a5d","Type":"ContainerStarted","Data":"6564fb5a2f200874fff83c545b4b09d137034c3f74291ef38b5c41e05391e70d"} Mar 13 12:07:09 crc kubenswrapper[4786]: I0313 12:07:09.836349 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" Mar 13 12:07:09 crc kubenswrapper[4786]: I0313 12:07:09.856478 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" podStartSLOduration=2.847565194 podStartE2EDuration="34.856446617s" 
podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.85647513 +0000 UTC m=+1184.136128577" lastFinishedPulling="2026-03-13 12:07:08.865356553 +0000 UTC m=+1216.145010000" observedRunningTime="2026-03-13 12:07:09.850571249 +0000 UTC m=+1217.130224726" watchObservedRunningTime="2026-03-13 12:07:09.856446617 +0000 UTC m=+1217.136100134" Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.845202 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" event={"ID":"784ee575-162b-4732-b82c-8f4b3c1e5317","Type":"ContainerStarted","Data":"bc817038fe14d88ec6b5a99c999a454e50e2c233233c246985b316e925871646"} Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.845613 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.847073 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" event={"ID":"8c1d644e-a547-48c7-bda5-95cdb6c0220f","Type":"ContainerStarted","Data":"8946c415cd1a7207faa16bd792eea4f4e4d6fbe07daaac2a1a16c3ca12a7dab8"} Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.848029 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.858173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" event={"ID":"f87ad580-b279-47e4-8fdd-462285c7bead","Type":"ContainerStarted","Data":"07033fecacbc72398e960bdeb7a6899422ffe3a5d659f969056164c519dc25cf"} Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.858556 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.882281 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" podStartSLOduration=33.500161444 podStartE2EDuration="35.882249064s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:07:07.556915027 +0000 UTC m=+1214.836568474" lastFinishedPulling="2026-03-13 12:07:09.939002647 +0000 UTC m=+1217.218656094" observedRunningTime="2026-03-13 12:07:10.875589104 +0000 UTC m=+1218.155242641" watchObservedRunningTime="2026-03-13 12:07:10.882249064 +0000 UTC m=+1218.161902551" Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.897716 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" podStartSLOduration=2.69572096 podStartE2EDuration="35.896865686s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.832481235 +0000 UTC m=+1184.112134682" lastFinishedPulling="2026-03-13 12:07:10.033625961 +0000 UTC m=+1217.313279408" observedRunningTime="2026-03-13 12:07:10.893893496 +0000 UTC m=+1218.173546943" watchObservedRunningTime="2026-03-13 12:07:10.896865686 +0000 UTC m=+1218.176519173" Mar 13 12:07:10 crc kubenswrapper[4786]: I0313 12:07:10.916016 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" podStartSLOduration=2.7747452040000002 podStartE2EDuration="35.915998831s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.7947587 +0000 UTC m=+1184.074412147" lastFinishedPulling="2026-03-13 12:07:09.936012317 +0000 UTC m=+1217.215665774" observedRunningTime="2026-03-13 12:07:10.912572858 +0000 UTC m=+1218.192226355" 
watchObservedRunningTime="2026-03-13 12:07:10.915998831 +0000 UTC m=+1218.195652278" Mar 13 12:07:11 crc kubenswrapper[4786]: I0313 12:07:11.295359 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b" Mar 13 12:07:11 crc kubenswrapper[4786]: I0313 12:07:11.868697 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" event={"ID":"ad8d13c6-f90b-4eb4-adce-1d20f690cc98","Type":"ContainerStarted","Data":"38c31f5165bc41011ccc29ccf187a0aae6403b694b17d8d6936d3a1f717fff5d"} Mar 13 12:07:11 crc kubenswrapper[4786]: I0313 12:07:11.892954 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" podStartSLOduration=2.780073067 podStartE2EDuration="36.892933623s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="2026-03-13 12:06:36.830966124 +0000 UTC m=+1184.110619571" lastFinishedPulling="2026-03-13 12:07:10.94382667 +0000 UTC m=+1218.223480127" observedRunningTime="2026-03-13 12:07:11.888005661 +0000 UTC m=+1219.167659118" watchObservedRunningTime="2026-03-13 12:07:11.892933623 +0000 UTC m=+1219.172587080" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.253762 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wrxdd" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.278127 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-652d5" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.288647 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-kgm9l" Mar 13 12:07:15 crc 
kubenswrapper[4786]: I0313 12:07:15.312413 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-wk47l" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.361901 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4jnjg" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.513762 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mf6t8" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.571996 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-stg56" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.652477 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-l2vgk" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.675830 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-fnchb" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.701350 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-4skd6" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.732663 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9w5pr" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.745546 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c284s" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.776856 4786 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8qxfc" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.831910 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-lj4bv" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.842093 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.874820 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdph5" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.875309 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-gw46v" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.925191 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pl6k5" Mar 13 12:07:15 crc kubenswrapper[4786]: I0313 12:07:15.976575 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4n8g4" Mar 13 12:07:17 crc kubenswrapper[4786]: I0313 12:07:17.311345 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-thr75" Mar 13 12:07:17 crc kubenswrapper[4786]: I0313 12:07:17.823629 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-5x2pg" Mar 13 12:07:25 crc kubenswrapper[4786]: I0313 12:07:25.844758 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6dcrg" Mar 13 12:07:38 crc kubenswrapper[4786]: I0313 12:07:38.169152 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:07:38 crc kubenswrapper[4786]: I0313 12:07:38.169997 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.618914 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-84lsl"] Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.620369 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.622620 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.623721 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-c5zt4" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.623857 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.628524 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.640126 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-84lsl"] Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.702239 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7440d8d4-0456-48ac-a688-a605c8e0c482-config\") pod \"dnsmasq-dns-5448ff6dc7-84lsl\" (UID: \"7440d8d4-0456-48ac-a688-a605c8e0c482\") " pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.702350 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fklln\" (UniqueName: \"kubernetes.io/projected/7440d8d4-0456-48ac-a688-a605c8e0c482-kube-api-access-fklln\") pod \"dnsmasq-dns-5448ff6dc7-84lsl\" (UID: \"7440d8d4-0456-48ac-a688-a605c8e0c482\") " pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.737956 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-584n2"] Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.766110 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.770196 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.771258 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-584n2"] Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.802975 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-config\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.803045 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-dns-svc\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.803062 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ztlj\" (UniqueName: \"kubernetes.io/projected/d92b379a-9d6d-4b24-b4c6-60af648de68b-kube-api-access-5ztlj\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.803103 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7440d8d4-0456-48ac-a688-a605c8e0c482-config\") pod \"dnsmasq-dns-5448ff6dc7-84lsl\" (UID: \"7440d8d4-0456-48ac-a688-a605c8e0c482\") " pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" Mar 13 12:07:44 crc 
kubenswrapper[4786]: I0313 12:07:44.803139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fklln\" (UniqueName: \"kubernetes.io/projected/7440d8d4-0456-48ac-a688-a605c8e0c482-kube-api-access-fklln\") pod \"dnsmasq-dns-5448ff6dc7-84lsl\" (UID: \"7440d8d4-0456-48ac-a688-a605c8e0c482\") " pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.804283 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7440d8d4-0456-48ac-a688-a605c8e0c482-config\") pod \"dnsmasq-dns-5448ff6dc7-84lsl\" (UID: \"7440d8d4-0456-48ac-a688-a605c8e0c482\") " pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.829963 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fklln\" (UniqueName: \"kubernetes.io/projected/7440d8d4-0456-48ac-a688-a605c8e0c482-kube-api-access-fklln\") pod \"dnsmasq-dns-5448ff6dc7-84lsl\" (UID: \"7440d8d4-0456-48ac-a688-a605c8e0c482\") " pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.903896 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-config\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.903985 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-dns-svc\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.904004 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5ztlj\" (UniqueName: \"kubernetes.io/projected/d92b379a-9d6d-4b24-b4c6-60af648de68b-kube-api-access-5ztlj\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.904638 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-config\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.904641 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-dns-svc\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.920388 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ztlj\" (UniqueName: \"kubernetes.io/projected/d92b379a-9d6d-4b24-b4c6-60af648de68b-kube-api-access-5ztlj\") pod \"dnsmasq-dns-64696987c5-584n2\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") " pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:44 crc kubenswrapper[4786]: I0313 12:07:44.936026 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.095385 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-584n2" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.166030 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-84lsl"] Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.346043 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-84lsl"] Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.378473 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-9tdtc"] Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.381768 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.390856 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-9tdtc"] Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.410398 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.410447 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-config\") pod \"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.410554 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ppxz\" (UniqueName: \"kubernetes.io/projected/3971b063-0530-44cd-9912-8c6c90128766-kube-api-access-5ppxz\") pod 
\"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.512062 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.512126 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-config\") pod \"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.512231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ppxz\" (UniqueName: \"kubernetes.io/projected/3971b063-0530-44cd-9912-8c6c90128766-kube-api-access-5ppxz\") pod \"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.512972 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.513036 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-config\") pod \"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " 
pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.531873 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ppxz\" (UniqueName: \"kubernetes.io/projected/3971b063-0530-44cd-9912-8c6c90128766-kube-api-access-5ppxz\") pod \"dnsmasq-dns-854f47b4f9-9tdtc\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.546940 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-584n2"] Mar 13 12:07:45 crc kubenswrapper[4786]: W0313 12:07:45.550108 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd92b379a_9d6d_4b24_b4c6_60af648de68b.slice/crio-4b1c3471ad562997d253ba3c135960ae8a9508cc09b894744cb917ab5d648eeb WatchSource:0}: Error finding container 4b1c3471ad562997d253ba3c135960ae8a9508cc09b894744cb917ab5d648eeb: Status 404 returned error can't find the container with id 4b1c3471ad562997d253ba3c135960ae8a9508cc09b894744cb917ab5d648eeb Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.699004 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.781559 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-584n2"] Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.825429 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-rfc95"] Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.826932 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.831619 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-rfc95"] Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.916013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" event={"ID":"7440d8d4-0456-48ac-a688-a605c8e0c482","Type":"ContainerStarted","Data":"aca74cefabbbd5f75584ba7a0b363c26fe481e7dfbc356cadbef59cbf5a5ba0c"} Mar 13 12:07:45 crc kubenswrapper[4786]: I0313 12:07:45.919467 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-584n2" event={"ID":"d92b379a-9d6d-4b24-b4c6-60af648de68b","Type":"ContainerStarted","Data":"4b1c3471ad562997d253ba3c135960ae8a9508cc09b894744cb917ab5d648eeb"} Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.018848 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.018950 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6wg\" (UniqueName: \"kubernetes.io/projected/4145a870-de38-47e9-a132-56500eb117fc-kube-api-access-kd6wg\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.018993 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-config\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: 
\"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.121681 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.122076 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6wg\" (UniqueName: \"kubernetes.io/projected/4145a870-de38-47e9-a132-56500eb117fc-kube-api-access-kd6wg\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.122119 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-config\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.122624 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.123035 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-config\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc 
kubenswrapper[4786]: I0313 12:07:46.142756 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6wg\" (UniqueName: \"kubernetes.io/projected/4145a870-de38-47e9-a132-56500eb117fc-kube-api-access-kd6wg\") pod \"dnsmasq-dns-54b5dffb47-rfc95\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.173174 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:07:46 crc kubenswrapper[4786]: W0313 12:07:46.227474 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3971b063_0530_44cd_9912_8c6c90128766.slice/crio-e388c6109d1020d03e655ae34cdb5732e3123dbf73574efbed9e7aa1ff72e624 WatchSource:0}: Error finding container e388c6109d1020d03e655ae34cdb5732e3123dbf73574efbed9e7aa1ff72e624: Status 404 returned error can't find the container with id e388c6109d1020d03e655ae34cdb5732e3123dbf73574efbed9e7aa1ff72e624 Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.229231 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-9tdtc"] Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.429015 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-rfc95"] Mar 13 12:07:46 crc kubenswrapper[4786]: W0313 12:07:46.459083 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4145a870_de38_47e9_a132_56500eb117fc.slice/crio-8cdf91bd739667f130d2af8574f6b7722a5d834aea6b05f72e65c80ceeb83614 WatchSource:0}: Error finding container 8cdf91bd739667f130d2af8574f6b7722a5d834aea6b05f72e65c80ceeb83614: Status 404 returned error can't find the container with id 8cdf91bd739667f130d2af8574f6b7722a5d834aea6b05f72e65c80ceeb83614 Mar 13 12:07:46 crc 
kubenswrapper[4786]: I0313 12:07:46.508616 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.509750 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.511428 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vjk5q" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.512167 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.512535 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.512557 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.512587 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.512591 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.512664 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.528200 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.632517 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " 
pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.632592 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.632648 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.633101 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.633290 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.633353 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drrl5\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-kube-api-access-drrl5\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 
12:07:46.633543 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.633627 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.633786 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.633839 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.633873 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735175 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735245 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735280 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drrl5\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-kube-api-access-drrl5\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735315 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735346 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735383 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735431 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735456 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735494 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735546 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " 
pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.735988 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.736769 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.736969 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.737757 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.738311 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.739037 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.740830 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.742383 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.744957 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.752609 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.755286 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drrl5\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-kube-api-access-drrl5\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " 
pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.774749 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") " pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.848999 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.931605 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" event={"ID":"3971b063-0530-44cd-9912-8c6c90128766","Type":"ContainerStarted","Data":"e388c6109d1020d03e655ae34cdb5732e3123dbf73574efbed9e7aa1ff72e624"} Mar 13 12:07:46 crc kubenswrapper[4786]: I0313 12:07:46.935083 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" event={"ID":"4145a870-de38-47e9-a132-56500eb117fc","Type":"ContainerStarted","Data":"8cdf91bd739667f130d2af8574f6b7722a5d834aea6b05f72e65c80ceeb83614"} Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.011697 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.013356 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.024855 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.025072 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.025137 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.025190 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.025328 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.025497 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.025662 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cp298" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.057638 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.141812 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142096 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142227 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142296 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnmkb\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-kube-api-access-mnmkb\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142397 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142482 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142556 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142642 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142723 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142806 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.142970 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245241 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245305 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245710 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmkb\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-kube-api-access-mnmkb\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245745 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245783 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245831 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245857 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245914 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245947 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.245953 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") device mount path \"/mnt/openstack/pv08\"" 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.246496 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.246790 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.247219 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.247248 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.247543 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.248088 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.250011 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.256917 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.258631 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.262690 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.278210 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnmkb\" (UniqueName: 
\"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-kube-api-access-mnmkb\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.289441 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.340786 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.689999 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:07:47 crc kubenswrapper[4786]: W0313 12:07:47.713788 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53fea24b_7ca8_4c0a_96d1_458ca1e877a7.slice/crio-ab3c6791a13213692e74230d9319ea4b6b280cd4c58b66916b1745cd9aa92039 WatchSource:0}: Error finding container ab3c6791a13213692e74230d9319ea4b6b280cd4c58b66916b1745cd9aa92039: Status 404 returned error can't find the container with id ab3c6791a13213692e74230d9319ea4b6b280cd4c58b66916b1745cd9aa92039 Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.737005 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.951220 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53fea24b-7ca8-4c0a-96d1-458ca1e877a7","Type":"ContainerStarted","Data":"ab3c6791a13213692e74230d9319ea4b6b280cd4c58b66916b1745cd9aa92039"} Mar 13 12:07:47 crc kubenswrapper[4786]: I0313 12:07:47.953739 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b196d91-2a1f-4ee5-81d5-0133f2917cc5","Type":"ContainerStarted","Data":"cfcd50ddcccf4394896df1c362f592c8dd9d7067ac5cac9cf40148279326b5db"} Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.524182 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.525700 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.534729 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gxxbl" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.536131 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.536381 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.537453 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.549600 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.551427 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.675923 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.675963 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.676002 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-config-data-default\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.676216 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/258afae9-f870-4f49-8102-3f987302da26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.676304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7nxn\" (UniqueName: \"kubernetes.io/projected/258afae9-f870-4f49-8102-3f987302da26-kube-api-access-t7nxn\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.676352 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.676401 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-kolla-config\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.676418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.778324 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/258afae9-f870-4f49-8102-3f987302da26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.778382 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7nxn\" (UniqueName: \"kubernetes.io/projected/258afae9-f870-4f49-8102-3f987302da26-kube-api-access-t7nxn\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.778402 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.778432 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-kolla-config\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.778449 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.778479 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.778497 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.778529 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-config-data-default\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.779346 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-config-data-default\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " 
pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.779968 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-kolla-config\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.780383 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/258afae9-f870-4f49-8102-3f987302da26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.781721 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.783993 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.784871 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.800269 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.806903 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7nxn\" (UniqueName: \"kubernetes.io/projected/258afae9-f870-4f49-8102-3f987302da26-kube-api-access-t7nxn\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.830740 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " pod="openstack/openstack-galera-0" Mar 13 12:07:48 crc kubenswrapper[4786]: I0313 12:07:48.848408 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 12:07:49 crc kubenswrapper[4786]: I0313 12:07:49.366951 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:07:49 crc kubenswrapper[4786]: I0313 12:07:49.830940 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:07:49 crc kubenswrapper[4786]: I0313 12:07:49.832285 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:49 crc kubenswrapper[4786]: I0313 12:07:49.834175 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 12:07:49 crc kubenswrapper[4786]: I0313 12:07:49.834994 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mcv9x" Mar 13 12:07:49 crc kubenswrapper[4786]: I0313 12:07:49.835219 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 12:07:49 crc kubenswrapper[4786]: I0313 12:07:49.835374 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 12:07:49 crc kubenswrapper[4786]: I0313 12:07:49.850093 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.002965 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"258afae9-f870-4f49-8102-3f987302da26","Type":"ContainerStarted","Data":"2f3ff7be82f53ef853033fb0112d325d3a9220de25c8891d5743a1aab4438220"} Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.019429 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.019493 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " 
pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.019531 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.019570 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.019601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.019637 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbgn4\" (UniqueName: \"kubernetes.io/projected/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kube-api-access-kbgn4\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.019665 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.019689 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121304 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121357 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121380 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121417 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbgn4\" (UniqueName: \"kubernetes.io/projected/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kube-api-access-kbgn4\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " 
pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121445 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121458 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121509 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121551 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.121661 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc 
kubenswrapper[4786]: I0313 12:07:50.123424 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.128769 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.137949 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.138100 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.143709 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.144618 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.145623 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.146490 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.148518 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.176198 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-r7zx7" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.176269 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.176460 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.188447 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.225316 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.225643 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.225686 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kolla-config\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.225739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-config-data\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.225764 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdbz\" (UniqueName: \"kubernetes.io/projected/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kube-api-access-hbdbz\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0" Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.230835 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbgn4\" (UniqueName: \"kubernetes.io/projected/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kube-api-access-kbgn4\") pod \"openstack-cell1-galera-0\" 
(UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.327533 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-config-data\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.327585 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdbz\" (UniqueName: \"kubernetes.io/projected/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kube-api-access-hbdbz\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.327632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.327686 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.327717 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kolla-config\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.328366 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-config-data\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.328997 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kolla-config\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.333361 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.348649 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.352115 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdbz\" (UniqueName: \"kubernetes.io/projected/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kube-api-access-hbdbz\") pod \"memcached-0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " pod="openstack/memcached-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.470386 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 13 12:07:50 crc kubenswrapper[4786]: I0313 12:07:50.574147 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 13 12:07:51 crc kubenswrapper[4786]: I0313 12:07:51.036095 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 13 12:07:51 crc kubenswrapper[4786]: W0313 12:07:51.047940 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b28a544_1e5f_46f0_a6d9_7a147c5d737e.slice/crio-b2ccbd80ec561cb0121e0d92ad8395ad6bf7bdc1ad86a2abd15af0803bba6f61 WatchSource:0}: Error finding container b2ccbd80ec561cb0121e0d92ad8395ad6bf7bdc1ad86a2abd15af0803bba6f61: Status 404 returned error can't find the container with id b2ccbd80ec561cb0121e0d92ad8395ad6bf7bdc1ad86a2abd15af0803bba6f61
Mar 13 12:07:51 crc kubenswrapper[4786]: I0313 12:07:51.147055 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 13 12:07:51 crc kubenswrapper[4786]: W0313 12:07:51.169932 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4f5bdf5_c352_4722_bcbd_704965ab36f0.slice/crio-2ca966fc1d68b16911c94352fe111f31c45089d9f16ba2c5cec8e8282997d80e WatchSource:0}: Error finding container 2ca966fc1d68b16911c94352fe111f31c45089d9f16ba2c5cec8e8282997d80e: Status 404 returned error can't find the container with id 2ca966fc1d68b16911c94352fe111f31c45089d9f16ba2c5cec8e8282997d80e
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.029890 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b28a544-1e5f-46f0-a6d9-7a147c5d737e","Type":"ContainerStarted","Data":"b2ccbd80ec561cb0121e0d92ad8395ad6bf7bdc1ad86a2abd15af0803bba6f61"}
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.032220 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e4f5bdf5-c352-4722-bcbd-704965ab36f0","Type":"ContainerStarted","Data":"2ca966fc1d68b16911c94352fe111f31c45089d9f16ba2c5cec8e8282997d80e"}
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.281148 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.282277 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.284426 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-blxqq"
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.292494 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.359104 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2rt\" (UniqueName: \"kubernetes.io/projected/befbdb06-def2-49f6-83c8-c3a84dd09334-kube-api-access-lt2rt\") pod \"kube-state-metrics-0\" (UID: \"befbdb06-def2-49f6-83c8-c3a84dd09334\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.460784 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2rt\" (UniqueName: \"kubernetes.io/projected/befbdb06-def2-49f6-83c8-c3a84dd09334-kube-api-access-lt2rt\") pod \"kube-state-metrics-0\" (UID: \"befbdb06-def2-49f6-83c8-c3a84dd09334\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.477869 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2rt\" (UniqueName: \"kubernetes.io/projected/befbdb06-def2-49f6-83c8-c3a84dd09334-kube-api-access-lt2rt\") pod \"kube-state-metrics-0\" (UID: \"befbdb06-def2-49f6-83c8-c3a84dd09334\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:07:52 crc kubenswrapper[4786]: I0313 12:07:52.607012 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.107859 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.109451 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.114300 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.114331 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gb6rs"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.114592 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.114698 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.114891 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.125938 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.222873 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.222922 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqtc5\" (UniqueName: \"kubernetes.io/projected/3a37be46-7b90-4c56-8dcf-a3ea45123df8-kube-api-access-dqtc5\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.222946 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.222972 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.223006 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.223045 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.223060 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.223076 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.324731 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.324788 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqtc5\" (UniqueName: \"kubernetes.io/projected/3a37be46-7b90-4c56-8dcf-a3ea45123df8-kube-api-access-dqtc5\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.324818 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.324849 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.324906 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.324954 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.324976 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.324996 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.325500 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.326070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.327020 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.327339 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.331494 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.333656 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.335872 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.357025 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.358123 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqtc5\" (UniqueName: \"kubernetes.io/projected/3a37be46-7b90-4c56-8dcf-a3ea45123df8-kube-api-access-dqtc5\") pod \"ovsdbserver-nb-0\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:56 crc kubenswrapper[4786]: I0313 12:07:56.440201 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.853639 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-666xn"]
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.854768 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.856896 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.857973 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.869956 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-t9b6r"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.871293 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666xn"]
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.882698 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tpch6"]
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.884325 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.925635 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tpch6"]
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952329 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b03b506e-7150-4904-b58b-8e442885af50-scripts\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952443 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/187d55eb-db2f-4935-91cc-8ef51895a35a-scripts\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952477 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-log\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952598 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-combined-ca-bundle\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952653 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-etc-ovs\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952683 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-log-ovn\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952737 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run-ovn\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952770 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-run\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952794 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-lib\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952839 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwx8k\" (UniqueName: \"kubernetes.io/projected/b03b506e-7150-4904-b58b-8e442885af50-kube-api-access-hwx8k\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952873 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgk4j\" (UniqueName: \"kubernetes.io/projected/187d55eb-db2f-4935-91cc-8ef51895a35a-kube-api-access-pgk4j\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:57 crc kubenswrapper[4786]: I0313 12:07:57.952926 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-ovn-controller-tls-certs\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.054450 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/187d55eb-db2f-4935-91cc-8ef51895a35a-scripts\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.054545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-log\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.054575 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.054961 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-combined-ca-bundle\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.054990 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-etc-ovs\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055013 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-log-ovn\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055033 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run-ovn\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055065 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-run\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055080 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-lib\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwx8k\" (UniqueName: \"kubernetes.io/projected/b03b506e-7150-4904-b58b-8e442885af50-kube-api-access-hwx8k\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgk4j\" (UniqueName: \"kubernetes.io/projected/187d55eb-db2f-4935-91cc-8ef51895a35a-kube-api-access-pgk4j\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055167 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-ovn-controller-tls-certs\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b03b506e-7150-4904-b58b-8e442885af50-scripts\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055977 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.055977 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-run\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.056028 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-log\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.056085 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run-ovn\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.056387 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-log-ovn\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.056477 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-etc-ovs\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.056735 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-lib\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.057456 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/187d55eb-db2f-4935-91cc-8ef51895a35a-scripts\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.058022 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b03b506e-7150-4904-b58b-8e442885af50-scripts\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.059956 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-combined-ca-bundle\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.060413 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-ovn-controller-tls-certs\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.077285 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwx8k\" (UniqueName: \"kubernetes.io/projected/b03b506e-7150-4904-b58b-8e442885af50-kube-api-access-hwx8k\") pod \"ovn-controller-666xn\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.077705 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgk4j\" (UniqueName: \"kubernetes.io/projected/187d55eb-db2f-4935-91cc-8ef51895a35a-kube-api-access-pgk4j\") pod \"ovn-controller-ovs-tpch6\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.186194 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn"
Mar 13 12:07:58 crc kubenswrapper[4786]: I0313 12:07:58.203541 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.793866 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.795303 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.797569 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.797804 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.797872 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.797912 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4fxzl"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.809994 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.881083 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.881130 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.881164 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.881180 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.881201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.881382 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.881483 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.881523 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj7w6\" (UniqueName: \"kubernetes.io/projected/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-kube-api-access-dj7w6\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.991259 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.991587 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj7w6\" (UniqueName: \"kubernetes.io/projected/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-kube-api-access-dj7w6\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.991744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.991783 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.991852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.991869 4786 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.991918 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.991985 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.992208 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.993484 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.993989 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:07:59 crc kubenswrapper[4786]: I0313 12:07:59.995665 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.002262 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.005826 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.010495 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.020629 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj7w6\" (UniqueName: \"kubernetes.io/projected/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-kube-api-access-dj7w6\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.020765 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.117939 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.135162 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556728-nddsg"] Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.136551 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-nddsg" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.140343 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-nddsg"] Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.140785 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.141010 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.141204 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.194891 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m6vf\" (UniqueName: \"kubernetes.io/projected/7ee76d39-03f4-4564-98f5-4903ea00568f-kube-api-access-8m6vf\") pod \"auto-csr-approver-29556728-nddsg\" (UID: \"7ee76d39-03f4-4564-98f5-4903ea00568f\") " pod="openshift-infra/auto-csr-approver-29556728-nddsg" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.296718 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m6vf\" (UniqueName: \"kubernetes.io/projected/7ee76d39-03f4-4564-98f5-4903ea00568f-kube-api-access-8m6vf\") pod \"auto-csr-approver-29556728-nddsg\" (UID: \"7ee76d39-03f4-4564-98f5-4903ea00568f\") " pod="openshift-infra/auto-csr-approver-29556728-nddsg" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.313151 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m6vf\" (UniqueName: \"kubernetes.io/projected/7ee76d39-03f4-4564-98f5-4903ea00568f-kube-api-access-8m6vf\") pod \"auto-csr-approver-29556728-nddsg\" (UID: \"7ee76d39-03f4-4564-98f5-4903ea00568f\") " pod="openshift-infra/auto-csr-approver-29556728-nddsg" Mar 13 12:08:00 crc kubenswrapper[4786]: I0313 12:08:00.454766 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-nddsg" Mar 13 12:08:06 crc kubenswrapper[4786]: E0313 12:08:06.953811 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 13 12:08:06 crc kubenswrapper[4786]: E0313 12:08:06.954456 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ztlj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-584n2_openstack(d92b379a-9d6d-4b24-b4c6-60af648de68b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:08:06 crc kubenswrapper[4786]: E0313 12:08:06.956254 4786 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-584n2" podUID="d92b379a-9d6d-4b24-b4c6-60af648de68b" Mar 13 12:08:06 crc kubenswrapper[4786]: E0313 12:08:06.960004 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 13 12:08:06 crc kubenswrapper[4786]: E0313 12:08:06.960126 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ppxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-854f47b4f9-9tdtc_openstack(3971b063-0530-44cd-9912-8c6c90128766): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:08:06 crc kubenswrapper[4786]: E0313 12:08:06.961291 4786 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" podUID="3971b063-0530-44cd-9912-8c6c90128766" Mar 13 12:08:07 crc kubenswrapper[4786]: E0313 12:08:07.179274 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" podUID="3971b063-0530-44cd-9912-8c6c90128766" Mar 13 12:08:08 crc kubenswrapper[4786]: I0313 12:08:08.169118 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:08:08 crc kubenswrapper[4786]: I0313 12:08:08.169185 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:08:08 crc kubenswrapper[4786]: I0313 12:08:08.169227 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 12:08:08 crc kubenswrapper[4786]: I0313 12:08:08.169836 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5656b6c6cc644913041fc5892205e2cc6f507fb238f0bcbc7956307710968e91"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:08:08 crc kubenswrapper[4786]: I0313 12:08:08.169877 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://5656b6c6cc644913041fc5892205e2cc6f507fb238f0bcbc7956307710968e91" gracePeriod=600 Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.769033 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95" Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.769480 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7nxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(258afae9-f870-4f49-8102-3f987302da26): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.770688 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="258afae9-f870-4f49-8102-3f987302da26" Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.792519 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.792709 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fklln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-84lsl_openstack(7440d8d4-0456-48ac-a688-a605c8e0c482): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.793904 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" podUID="7440d8d4-0456-48ac-a688-a605c8e0c482" Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.798565 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d" Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.798735 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-drrl5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(53fea24b-7ca8-4c0a-96d1-458ca1e877a7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:08:08 crc 
kubenswrapper[4786]: E0313 12:08:08.800049 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7"
Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.803528 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51"
Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.803860 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kd6wg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54b5dffb47-rfc95_openstack(4145a870-de38-47e9-a132-56500eb117fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 13 12:08:08 crc kubenswrapper[4786]: E0313 12:08:08.805108 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" podUID="4145a870-de38-47e9-a132-56500eb117fc"
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.193497 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="5656b6c6cc644913041fc5892205e2cc6f507fb238f0bcbc7956307710968e91" exitCode=0
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.193555 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"5656b6c6cc644913041fc5892205e2cc6f507fb238f0bcbc7956307710968e91"}
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.193610 4786 scope.go:117] "RemoveContainer" containerID="cee9aff52905686331ac0d49b868be713596890b00b0633ce66e8cdee6b5f0de"
Mar 13 12:08:09 crc kubenswrapper[4786]: E0313 12:08:09.195651 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95\\\"\"" pod="openstack/openstack-galera-0" podUID="258afae9-f870-4f49-8102-3f987302da26"
Mar 13 12:08:09 crc kubenswrapper[4786]: E0313 12:08:09.198585 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" podUID="4145a870-de38-47e9-a132-56500eb117fc"
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.588445 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-584n2"
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.608110 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl"
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.651178 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-config\") pod \"d92b379a-9d6d-4b24-b4c6-60af648de68b\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") "
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.651301 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-dns-svc\") pod \"d92b379a-9d6d-4b24-b4c6-60af648de68b\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") "
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.651351 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fklln\" (UniqueName: \"kubernetes.io/projected/7440d8d4-0456-48ac-a688-a605c8e0c482-kube-api-access-fklln\") pod \"7440d8d4-0456-48ac-a688-a605c8e0c482\" (UID: \"7440d8d4-0456-48ac-a688-a605c8e0c482\") "
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.651420 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ztlj\" (UniqueName: \"kubernetes.io/projected/d92b379a-9d6d-4b24-b4c6-60af648de68b-kube-api-access-5ztlj\") pod \"d92b379a-9d6d-4b24-b4c6-60af648de68b\" (UID: \"d92b379a-9d6d-4b24-b4c6-60af648de68b\") "
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.651449 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7440d8d4-0456-48ac-a688-a605c8e0c482-config\") pod \"7440d8d4-0456-48ac-a688-a605c8e0c482\" (UID: \"7440d8d4-0456-48ac-a688-a605c8e0c482\") "
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.651825 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-config" (OuterVolumeSpecName: "config") pod "d92b379a-9d6d-4b24-b4c6-60af648de68b" (UID: "d92b379a-9d6d-4b24-b4c6-60af648de68b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.651998 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.652319 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7440d8d4-0456-48ac-a688-a605c8e0c482-config" (OuterVolumeSpecName: "config") pod "7440d8d4-0456-48ac-a688-a605c8e0c482" (UID: "7440d8d4-0456-48ac-a688-a605c8e0c482"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.655157 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92b379a-9d6d-4b24-b4c6-60af648de68b-kube-api-access-5ztlj" (OuterVolumeSpecName: "kube-api-access-5ztlj") pod "d92b379a-9d6d-4b24-b4c6-60af648de68b" (UID: "d92b379a-9d6d-4b24-b4c6-60af648de68b"). InnerVolumeSpecName "kube-api-access-5ztlj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.656076 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d92b379a-9d6d-4b24-b4c6-60af648de68b" (UID: "d92b379a-9d6d-4b24-b4c6-60af648de68b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.675444 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7440d8d4-0456-48ac-a688-a605c8e0c482-kube-api-access-fklln" (OuterVolumeSpecName: "kube-api-access-fklln") pod "7440d8d4-0456-48ac-a688-a605c8e0c482" (UID: "7440d8d4-0456-48ac-a688-a605c8e0c482"). InnerVolumeSpecName "kube-api-access-fklln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.752963 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92b379a-9d6d-4b24-b4c6-60af648de68b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.753298 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fklln\" (UniqueName: \"kubernetes.io/projected/7440d8d4-0456-48ac-a688-a605c8e0c482-kube-api-access-fklln\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.753309 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ztlj\" (UniqueName: \"kubernetes.io/projected/d92b379a-9d6d-4b24-b4c6-60af648de68b-kube-api-access-5ztlj\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.753317 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7440d8d4-0456-48ac-a688-a605c8e0c482-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:09 crc kubenswrapper[4786]: I0313 12:08:09.936006 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.117543 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.236717 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e4f5bdf5-c352-4722-bcbd-704965ab36f0","Type":"ContainerStarted","Data":"f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf"}
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.236989 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.238042 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a37be46-7b90-4c56-8dcf-a3ea45123df8","Type":"ContainerStarted","Data":"324e06768cec01cf06404efb519d916ff9eb0647dda54f75c85a98192979ac4b"}
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.239360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-584n2" event={"ID":"d92b379a-9d6d-4b24-b4c6-60af648de68b","Type":"ContainerDied","Data":"4b1c3471ad562997d253ba3c135960ae8a9508cc09b894744cb917ab5d648eeb"}
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.239409 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-584n2"
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.241767 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl" event={"ID":"7440d8d4-0456-48ac-a688-a605c8e0c482","Type":"ContainerDied","Data":"aca74cefabbbd5f75584ba7a0b363c26fe481e7dfbc356cadbef59cbf5a5ba0c"}
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.241814 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-84lsl"
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.244141 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b28a544-1e5f-46f0-a6d9-7a147c5d737e","Type":"ContainerStarted","Data":"fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8"}
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.249114 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"1e7dfa6aebd4ca8695c470b1c4f1a2306b0f2eefc624c2d634686a5a8cd4e40b"}
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.250760 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"befbdb06-def2-49f6-83c8-c3a84dd09334","Type":"ContainerStarted","Data":"86dd67ae5f5f05a266da244184a9dfebc8ad3556c3723bc5a97c5eeadc46bd53"}
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.265914 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.8343727269999999 podStartE2EDuration="20.265896796s" podCreationTimestamp="2026-03-13 12:07:50 +0000 UTC" firstStartedPulling="2026-03-13 12:07:51.173865334 +0000 UTC m=+1258.453518781" lastFinishedPulling="2026-03-13 12:08:09.605389403 +0000 UTC m=+1276.885042850" observedRunningTime="2026-03-13 12:08:10.253921738 +0000 UTC m=+1277.533575195" watchObservedRunningTime="2026-03-13 12:08:10.265896796 +0000 UTC m=+1277.545550243"
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.389031 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666xn"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.404781 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-84lsl"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.420031 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-84lsl"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.448165 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-584n2"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.457981 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-584n2"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.510363 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tpch6"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.570262 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-nddsg"]
Mar 13 12:08:10 crc kubenswrapper[4786]: I0313 12:08:10.655535 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 13 12:08:10 crc kubenswrapper[4786]: W0313 12:08:10.845124 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e0cbbc8_b706_4a93_bd1b_442a68cce24b.slice/crio-d62cac7c78b0b8b9b878871534cc2e4db451db4a8e9e1afed2f546809e636f0e WatchSource:0}: Error finding container d62cac7c78b0b8b9b878871534cc2e4db451db4a8e9e1afed2f546809e636f0e: Status 404 returned error can't find the container with id d62cac7c78b0b8b9b878871534cc2e4db451db4a8e9e1afed2f546809e636f0e
Mar 13 12:08:11 crc kubenswrapper[4786]: I0313 12:08:11.261587 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn" event={"ID":"b03b506e-7150-4904-b58b-8e442885af50","Type":"ContainerStarted","Data":"30776095502a0329caf38ce2f83b6444380f242d9cdfaa99776a8ae91524b0b4"}
Mar 13 12:08:11 crc kubenswrapper[4786]: I0313 12:08:11.265528 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpch6" event={"ID":"187d55eb-db2f-4935-91cc-8ef51895a35a","Type":"ContainerStarted","Data":"051b19402dfadd324f40642cd77d4b6f14e07391e2fd2b7bc31fc5f9090c7ee7"}
Mar 13 12:08:11 crc kubenswrapper[4786]: I0313 12:08:11.267420 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b196d91-2a1f-4ee5-81d5-0133f2917cc5","Type":"ContainerStarted","Data":"aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d"}
Mar 13 12:08:11 crc kubenswrapper[4786]: I0313 12:08:11.269498 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-nddsg" event={"ID":"7ee76d39-03f4-4564-98f5-4903ea00568f","Type":"ContainerStarted","Data":"a5a79d430a65960e8147f45312dde3687f23191e65e608dcdaf6cba660ef9b38"}
Mar 13 12:08:11 crc kubenswrapper[4786]: I0313 12:08:11.270796 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e0cbbc8-b706-4a93-bd1b-442a68cce24b","Type":"ContainerStarted","Data":"d62cac7c78b0b8b9b878871534cc2e4db451db4a8e9e1afed2f546809e636f0e"}
Mar 13 12:08:11 crc kubenswrapper[4786]: I0313 12:08:11.273482 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53fea24b-7ca8-4c0a-96d1-458ca1e877a7","Type":"ContainerStarted","Data":"27d6eb8401490fb55d774c4f395089b4fb75b0cc2244cbaf43b5759b74129ca6"}
Mar 13 12:08:11 crc kubenswrapper[4786]: I0313 12:08:11.453008 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7440d8d4-0456-48ac-a688-a605c8e0c482" path="/var/lib/kubelet/pods/7440d8d4-0456-48ac-a688-a605c8e0c482/volumes"
Mar 13 12:08:11 crc kubenswrapper[4786]: I0313 12:08:11.454150 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92b379a-9d6d-4b24-b4c6-60af648de68b" path="/var/lib/kubelet/pods/d92b379a-9d6d-4b24-b4c6-60af648de68b/volumes"
Mar 13 12:08:14 crc kubenswrapper[4786]: I0313 12:08:14.294755 4786 generic.go:334] "Generic (PLEG): container finished" podID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" containerID="fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8" exitCode=0
Mar 13 12:08:14 crc kubenswrapper[4786]: I0313 12:08:14.295000 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b28a544-1e5f-46f0-a6d9-7a147c5d737e","Type":"ContainerDied","Data":"fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8"}
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.302740 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"befbdb06-def2-49f6-83c8-c3a84dd09334","Type":"ContainerStarted","Data":"f8f7253cf1f9e788176a413b643a024cd5a1da997bb15d8a92fc01baed2f8c02"}
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.303341 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.304240 4786 generic.go:334] "Generic (PLEG): container finished" podID="7ee76d39-03f4-4564-98f5-4903ea00568f" containerID="30476293489f29d0bf28a9f340cd844ff67d79f12b2a58ed298bb9282c465e69" exitCode=0
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.304317 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-nddsg" event={"ID":"7ee76d39-03f4-4564-98f5-4903ea00568f","Type":"ContainerDied","Data":"30476293489f29d0bf28a9f340cd844ff67d79f12b2a58ed298bb9282c465e69"}
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.305775 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e0cbbc8-b706-4a93-bd1b-442a68cce24b","Type":"ContainerStarted","Data":"cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f"}
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.307526 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a37be46-7b90-4c56-8dcf-a3ea45123df8","Type":"ContainerStarted","Data":"9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01"}
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.309191 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn" event={"ID":"b03b506e-7150-4904-b58b-8e442885af50","Type":"ContainerStarted","Data":"15a58925a001b150b8aba5de1a05d26b0e8b136642a71e1d37e08618e19f5026"}
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.309307 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-666xn"
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.310743 4786 generic.go:334] "Generic (PLEG): container finished" podID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerID="f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34" exitCode=0
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.310786 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpch6" event={"ID":"187d55eb-db2f-4935-91cc-8ef51895a35a","Type":"ContainerDied","Data":"f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34"}
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.313439 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b28a544-1e5f-46f0-a6d9-7a147c5d737e","Type":"ContainerStarted","Data":"ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7"}
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.322945 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.71728126 podStartE2EDuration="23.322928435s" podCreationTimestamp="2026-03-13 12:07:52 +0000 UTC" firstStartedPulling="2026-03-13 12:08:09.981632725 +0000 UTC m=+1277.261286172" lastFinishedPulling="2026-03-13 12:08:14.5872799 +0000 UTC m=+1281.866933347" observedRunningTime="2026-03-13 12:08:15.319467072 +0000 UTC m=+1282.599120509" watchObservedRunningTime="2026-03-13 12:08:15.322928435 +0000 UTC m=+1282.602581902"
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.351544 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-666xn" podStartSLOduration=14.146170582 podStartE2EDuration="18.351523936s" podCreationTimestamp="2026-03-13 12:07:57 +0000 UTC" firstStartedPulling="2026-03-13 12:08:10.390469794 +0000 UTC m=+1277.670123241" lastFinishedPulling="2026-03-13 12:08:14.595823148 +0000 UTC m=+1281.875476595" observedRunningTime="2026-03-13 12:08:15.342863805 +0000 UTC m=+1282.622517252" watchObservedRunningTime="2026-03-13 12:08:15.351523936 +0000 UTC m=+1282.631177383"
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.408967 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.847212829 podStartE2EDuration="27.408948516s" podCreationTimestamp="2026-03-13 12:07:48 +0000 UTC" firstStartedPulling="2026-03-13 12:07:51.051475914 +0000 UTC m=+1258.331129361" lastFinishedPulling="2026-03-13 12:08:09.613211601 +0000 UTC m=+1276.892865048" observedRunningTime="2026-03-13 12:08:15.380747644 +0000 UTC m=+1282.660401091" watchObservedRunningTime="2026-03-13 12:08:15.408948516 +0000 UTC m=+1282.688601973"
Mar 13 12:08:15 crc kubenswrapper[4786]: I0313 12:08:15.577984 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 13 12:08:16 crc kubenswrapper[4786]: I0313 12:08:16.330385 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpch6" event={"ID":"187d55eb-db2f-4935-91cc-8ef51895a35a","Type":"ContainerStarted","Data":"57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3"}
Mar 13 12:08:16 crc kubenswrapper[4786]: I0313 12:08:16.330691 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpch6" event={"ID":"187d55eb-db2f-4935-91cc-8ef51895a35a","Type":"ContainerStarted","Data":"7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd"}
Mar 13 12:08:16 crc kubenswrapper[4786]: I0313 12:08:16.353870 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tpch6" podStartSLOduration=15.298551126 podStartE2EDuration="19.353852834s" podCreationTimestamp="2026-03-13 12:07:57 +0000 UTC" firstStartedPulling="2026-03-13 12:08:10.535025914 +0000 UTC m=+1277.814679351" lastFinishedPulling="2026-03-13 12:08:14.590327592 +0000 UTC m=+1281.869981059" observedRunningTime="2026-03-13 12:08:16.352358374 +0000 UTC m=+1283.632011831" watchObservedRunningTime="2026-03-13 12:08:16.353852834 +0000 UTC m=+1283.633506301"
Mar 13 12:08:16 crc kubenswrapper[4786]: I0313 12:08:16.677591 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-nddsg"
Mar 13 12:08:16 crc kubenswrapper[4786]: I0313 12:08:16.725926 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m6vf\" (UniqueName: \"kubernetes.io/projected/7ee76d39-03f4-4564-98f5-4903ea00568f-kube-api-access-8m6vf\") pod \"7ee76d39-03f4-4564-98f5-4903ea00568f\" (UID: \"7ee76d39-03f4-4564-98f5-4903ea00568f\") "
Mar 13 12:08:16 crc kubenswrapper[4786]: I0313 12:08:16.731715 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee76d39-03f4-4564-98f5-4903ea00568f-kube-api-access-8m6vf" (OuterVolumeSpecName: "kube-api-access-8m6vf") pod "7ee76d39-03f4-4564-98f5-4903ea00568f" (UID: "7ee76d39-03f4-4564-98f5-4903ea00568f"). InnerVolumeSpecName "kube-api-access-8m6vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:16 crc kubenswrapper[4786]: I0313 12:08:16.829137 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m6vf\" (UniqueName: \"kubernetes.io/projected/7ee76d39-03f4-4564-98f5-4903ea00568f-kube-api-access-8m6vf\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:17 crc kubenswrapper[4786]: I0313 12:08:17.346988 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-nddsg"
Mar 13 12:08:17 crc kubenswrapper[4786]: I0313 12:08:17.346987 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-nddsg" event={"ID":"7ee76d39-03f4-4564-98f5-4903ea00568f","Type":"ContainerDied","Data":"a5a79d430a65960e8147f45312dde3687f23191e65e608dcdaf6cba660ef9b38"}
Mar 13 12:08:17 crc kubenswrapper[4786]: I0313 12:08:17.347764 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a79d430a65960e8147f45312dde3687f23191e65e608dcdaf6cba660ef9b38"
Mar 13 12:08:17 crc kubenswrapper[4786]: I0313 12:08:17.347785 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:08:17 crc kubenswrapper[4786]: I0313 12:08:17.347797 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tpch6"
Mar 13 12:08:17 crc kubenswrapper[4786]: I0313 12:08:17.749384 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-4dhwg"]
Mar 13 12:08:17 crc kubenswrapper[4786]: I0313 12:08:17.753974 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-4dhwg"]
Mar 13 12:08:17 crc kubenswrapper[4786]: E0313 12:08:17.957013 4786 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:49290->38.102.83.151:42535: write tcp 38.102.83.151:49290->38.102.83.151:42535: write: broken pipe
Mar 13 12:08:19 crc kubenswrapper[4786]: I0313 12:08:19.449463 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af93e5fc-a1db-44c4-aecb-db9648c603ab" path="/var/lib/kubelet/pods/af93e5fc-a1db-44c4-aecb-db9648c603ab/volumes"
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.374727 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e0cbbc8-b706-4a93-bd1b-442a68cce24b","Type":"ContainerStarted","Data":"6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b"}
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.376782 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a37be46-7b90-4c56-8dcf-a3ea45123df8","Type":"ContainerStarted","Data":"75127e67378a2a6d7c4e145c4a096adfcfcfcab00c002649d437ac56addedfda"}
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.405260 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.98774598 podStartE2EDuration="22.405234516s" podCreationTimestamp="2026-03-13 12:07:58 +0000 UTC" firstStartedPulling="2026-03-13 12:08:10.851813753 +0000 UTC m=+1278.131467200" lastFinishedPulling="2026-03-13 12:08:19.269302289 +0000 UTC m=+1286.548955736" observedRunningTime="2026-03-13 12:08:20.398868886 +0000 UTC m=+1287.678522403" watchObservedRunningTime="2026-03-13 12:08:20.405234516 +0000 UTC m=+1287.684887973"
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.438476 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.284641531 podStartE2EDuration="25.43845056s" podCreationTimestamp="2026-03-13 12:07:55 +0000 UTC" firstStartedPulling="2026-03-13 12:08:10.13499687 +0000 UTC m=+1277.414650317" lastFinishedPulling="2026-03-13 12:08:19.288805899 +0000 UTC m=+1286.568459346" observedRunningTime="2026-03-13 12:08:20.431689131 +0000 UTC m=+1287.711342598" watchObservedRunningTime="2026-03-13 12:08:20.43845056 +0000 UTC m=+1287.718104077"
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.440946 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.471063 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.471141 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.482938 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:08:20 crc kubenswrapper[4786]: I0313 12:08:20.565340 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.118586 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.356511 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.385184 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.385228 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.425997 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.430182 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.498153 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.618981 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-rfc95"]
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.668005 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-kbjq9"]
Mar 13 12:08:21 crc kubenswrapper[4786]: E0313 12:08:21.668337 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee76d39-03f4-4564-98f5-4903ea00568f" containerName="oc"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.668354 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee76d39-03f4-4564-98f5-4903ea00568f" containerName="oc"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.668533 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee76d39-03f4-4564-98f5-4903ea00568f" containerName="oc"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.669301 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.672225 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.681084 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-kbjq9"]
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.813963 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.818371 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.827660 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.827858 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.828073 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ldpcr"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.828214 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.833326 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mpvpt"]
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.834322 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mpvpt"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.837675 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.848302 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-config\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.848345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.852624 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.855374 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-dns-svc\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.856835 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5qh\" (UniqueName: \"kubernetes.io/projected/0aa4eda0-a00d-4a78-8378-9baea6549157-kube-api-access-jj5qh\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.891169 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-9tdtc"]
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.926700 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mpvpt"]
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.958002 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-brfln"]
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959247 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e05d002-d224-4a13-8497-fc49712f7084-config\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt"
Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959310 4786 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959319 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-combined-ca-bundle\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959350 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-dns-svc\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959382 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-config\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959405 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959433 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovn-rundir\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " 
pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959489 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v5rl\" (UniqueName: \"kubernetes.io/projected/5e05d002-d224-4a13-8497-fc49712f7084-kube-api-access-9v5rl\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959518 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovs-rundir\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959542 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj5qh\" (UniqueName: \"kubernetes.io/projected/0aa4eda0-a00d-4a78-8378-9baea6549157-kube-api-access-jj5qh\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959564 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959586 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " 
pod="openstack/ovn-northd-0" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959622 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959658 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959700 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-config\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.959724 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhfj\" (UniqueName: \"kubernetes.io/projected/2b25a4cb-7b76-4863-9085-67f99d81f569-kube-api-access-bjhfj\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.960093 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 
12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.960118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-scripts\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.960929 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-config\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.961128 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.961782 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-dns-svc\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.969474 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 13 12:08:21 crc kubenswrapper[4786]: I0313 12:08:21.978901 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-brfln"] Mar 13 12:08:21 crc kubenswrapper[4786]: E0313 12:08:21.986629 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4145a870_de38_47e9_a132_56500eb117fc.slice/crio-conmon-3fe1b218c0311cea8d74667897f5bad03685a4e22895a4dfd85265b2994b2eba.scope\": RecentStats: unable to find data in memory cache]" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.000112 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5qh\" (UniqueName: \"kubernetes.io/projected/0aa4eda0-a00d-4a78-8378-9baea6549157-kube-api-access-jj5qh\") pod \"dnsmasq-dns-7988f9db49-kbjq9\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065245 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-config\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065309 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h52gx\" (UniqueName: \"kubernetes.io/projected/84d04c9e-4fd8-4b66-9481-bc2f3c774887-kube-api-access-h52gx\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065342 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065372 4786 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e05d002-d224-4a13-8497-fc49712f7084-config\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065409 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065450 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-combined-ca-bundle\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065483 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-config\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065505 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065530 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovn-rundir\") pod 
\"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065583 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v5rl\" (UniqueName: \"kubernetes.io/projected/5e05d002-d224-4a13-8497-fc49712f7084-kube-api-access-9v5rl\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065603 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovs-rundir\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065625 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065641 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065674 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 
12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065716 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065746 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhfj\" (UniqueName: \"kubernetes.io/projected/2b25a4cb-7b76-4863-9085-67f99d81f569-kube-api-access-bjhfj\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.065760 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-scripts\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.066087 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovn-rundir\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.066255 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovs-rundir\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.066551 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.066861 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-scripts\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.067164 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-config\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.068633 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e05d002-d224-4a13-8497-fc49712f7084-config\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.070845 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " 
pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.071891 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-combined-ca-bundle\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.071950 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.072420 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.075416 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.085087 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v5rl\" (UniqueName: \"kubernetes.io/projected/5e05d002-d224-4a13-8497-fc49712f7084-kube-api-access-9v5rl\") pod \"ovn-controller-metrics-mpvpt\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.086572 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhfj\" (UniqueName: \"kubernetes.io/projected/2b25a4cb-7b76-4863-9085-67f99d81f569-kube-api-access-bjhfj\") pod \"ovn-northd-0\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.167003 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.167077 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-config\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.167112 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h52gx\" (UniqueName: \"kubernetes.io/projected/84d04c9e-4fd8-4b66-9481-bc2f3c774887-kube-api-access-h52gx\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.167135 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.167165 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.168085 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.168366 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-config\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.168872 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.169674 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.175470 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.188079 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.189509 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h52gx\" (UniqueName: \"kubernetes.io/projected/84d04c9e-4fd8-4b66-9481-bc2f3c774887-kube-api-access-h52gx\") pod \"dnsmasq-dns-5d944d7b75-brfln\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.261708 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.267572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-config\") pod \"3971b063-0530-44cd-9912-8c6c90128766\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.267637 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-dns-svc\") pod \"3971b063-0530-44cd-9912-8c6c90128766\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.267755 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ppxz\" (UniqueName: \"kubernetes.io/projected/3971b063-0530-44cd-9912-8c6c90128766-kube-api-access-5ppxz\") pod \"3971b063-0530-44cd-9912-8c6c90128766\" (UID: \"3971b063-0530-44cd-9912-8c6c90128766\") " Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.268146 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-config" (OuterVolumeSpecName: "config") pod "3971b063-0530-44cd-9912-8c6c90128766" (UID: "3971b063-0530-44cd-9912-8c6c90128766"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.268208 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3971b063-0530-44cd-9912-8c6c90128766" (UID: "3971b063-0530-44cd-9912-8c6c90128766"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.270935 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3971b063-0530-44cd-9912-8c6c90128766-kube-api-access-5ppxz" (OuterVolumeSpecName: "kube-api-access-5ppxz") pod "3971b063-0530-44cd-9912-8c6c90128766" (UID: "3971b063-0530-44cd-9912-8c6c90128766"). InnerVolumeSpecName "kube-api-access-5ppxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.287521 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.291269 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.369181 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ppxz\" (UniqueName: \"kubernetes.io/projected/3971b063-0530-44cd-9912-8c6c90128766-kube-api-access-5ppxz\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.369223 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.369236 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3971b063-0530-44cd-9912-8c6c90128766-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.396121 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"258afae9-f870-4f49-8102-3f987302da26","Type":"ContainerStarted","Data":"22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909"} Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.397697 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" event={"ID":"3971b063-0530-44cd-9912-8c6c90128766","Type":"ContainerDied","Data":"e388c6109d1020d03e655ae34cdb5732e3123dbf73574efbed9e7aa1ff72e624"} Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.397753 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-9tdtc" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.406172 4786 generic.go:334] "Generic (PLEG): container finished" podID="4145a870-de38-47e9-a132-56500eb117fc" containerID="3fe1b218c0311cea8d74667897f5bad03685a4e22895a4dfd85265b2994b2eba" exitCode=0 Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.406381 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" event={"ID":"4145a870-de38-47e9-a132-56500eb117fc","Type":"ContainerDied","Data":"3fe1b218c0311cea8d74667897f5bad03685a4e22895a4dfd85265b2994b2eba"} Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.553935 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-9tdtc"] Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.570761 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-9tdtc"] Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.628799 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.712206 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-brfln"] Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.738937 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-f5l28"] Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.740296 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.756086 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.776214 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-f5l28"] Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.782533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.782585 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.782641 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpx7k\" (UniqueName: \"kubernetes.io/projected/5369266d-f8f8-4667-a8dd-0f316e959fc0-kube-api-access-cpx7k\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.782683 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-config\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" 
Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.782706 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.806495 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mpvpt"] Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.884567 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpx7k\" (UniqueName: \"kubernetes.io/projected/5369266d-f8f8-4667-a8dd-0f316e959fc0-kube-api-access-cpx7k\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.884662 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-config\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.884693 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.884760 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-dns-svc\") pod 
\"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.884841 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.887274 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.888322 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-config\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.893116 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.893693 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " 
pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.909057 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpx7k\" (UniqueName: \"kubernetes.io/projected/5369266d-f8f8-4667-a8dd-0f316e959fc0-kube-api-access-cpx7k\") pod \"dnsmasq-dns-7b9fd7d84c-f5l28\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.939554 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.966380 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-brfln"] Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.986261 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd6wg\" (UniqueName: \"kubernetes.io/projected/4145a870-de38-47e9-a132-56500eb117fc-kube-api-access-kd6wg\") pod \"4145a870-de38-47e9-a132-56500eb117fc\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.986396 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-config\") pod \"4145a870-de38-47e9-a132-56500eb117fc\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " Mar 13 12:08:22 crc kubenswrapper[4786]: I0313 12:08:22.986439 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-dns-svc\") pod \"4145a870-de38-47e9-a132-56500eb117fc\" (UID: \"4145a870-de38-47e9-a132-56500eb117fc\") " Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.005915 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4145a870-de38-47e9-a132-56500eb117fc-kube-api-access-kd6wg" (OuterVolumeSpecName: "kube-api-access-kd6wg") pod "4145a870-de38-47e9-a132-56500eb117fc" (UID: "4145a870-de38-47e9-a132-56500eb117fc"). InnerVolumeSpecName "kube-api-access-kd6wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:23 crc kubenswrapper[4786]: W0313 12:08:23.011347 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aa4eda0_a00d_4a78_8378_9baea6549157.slice/crio-ba1b27812aece12e4c51839c0f045262d8de230f85db243aa454dc5113c03df8 WatchSource:0}: Error finding container ba1b27812aece12e4c51839c0f045262d8de230f85db243aa454dc5113c03df8: Status 404 returned error can't find the container with id ba1b27812aece12e4c51839c0f045262d8de230f85db243aa454dc5113c03df8 Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.020632 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-kbjq9"] Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.025484 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-config" (OuterVolumeSpecName: "config") pod "4145a870-de38-47e9-a132-56500eb117fc" (UID: "4145a870-de38-47e9-a132-56500eb117fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.053534 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4145a870-de38-47e9-a132-56500eb117fc" (UID: "4145a870-de38-47e9-a132-56500eb117fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.068229 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.088065 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.088101 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd6wg\" (UniqueName: \"kubernetes.io/projected/4145a870-de38-47e9-a132-56500eb117fc-kube-api-access-kd6wg\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.088116 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4145a870-de38-47e9-a132-56500eb117fc-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.367603 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-f5l28"] Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.468766 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3971b063-0530-44cd-9912-8c6c90128766" path="/var/lib/kubelet/pods/3971b063-0530-44cd-9912-8c6c90128766/volumes" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.482195 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" event={"ID":"4145a870-de38-47e9-a132-56500eb117fc","Type":"ContainerDied","Data":"8cdf91bd739667f130d2af8574f6b7722a5d834aea6b05f72e65c80ceeb83614"} Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.482250 4786 scope.go:117] "RemoveContainer" containerID="3fe1b218c0311cea8d74667897f5bad03685a4e22895a4dfd85265b2994b2eba" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.482384 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-rfc95" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.499067 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b25a4cb-7b76-4863-9085-67f99d81f569","Type":"ContainerStarted","Data":"c4b3914edbaf72edb7abfa866ecf88fb7934c0dc501c7ee79e9e90bf44838b9f"} Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.515756 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mpvpt" event={"ID":"5e05d002-d224-4a13-8497-fc49712f7084","Type":"ContainerStarted","Data":"e7effd0a662e6015905b4c8a787a7e2eab9bffc913dcf3ece7c373f8bf960414"} Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.515799 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mpvpt" event={"ID":"5e05d002-d224-4a13-8497-fc49712f7084","Type":"ContainerStarted","Data":"01a15da7ad3738f7bd7a017441463cf4c03ff825c19e9088268a352896105eb2"} Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.529132 4786 generic.go:334] "Generic (PLEG): container finished" podID="84d04c9e-4fd8-4b66-9481-bc2f3c774887" containerID="934afd95f50bea854d57a332a60580c311a92539f6712f86f67c8e026b376ecf" exitCode=0 Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.529246 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-brfln" event={"ID":"84d04c9e-4fd8-4b66-9481-bc2f3c774887","Type":"ContainerDied","Data":"934afd95f50bea854d57a332a60580c311a92539f6712f86f67c8e026b376ecf"} Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.529279 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-brfln" event={"ID":"84d04c9e-4fd8-4b66-9481-bc2f3c774887","Type":"ContainerStarted","Data":"08b3a5964d2a3aa3d54e65bf04aa7a9fac4fa35617990bbbb03b57b4ce05e2df"} Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.559320 4786 generic.go:334] "Generic (PLEG): container 
finished" podID="0aa4eda0-a00d-4a78-8378-9baea6549157" containerID="8b35c8137b296f34d36d7b917eca3d536add1ef6c90871a4f4d31b5caaf4f34d" exitCode=0 Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.562997 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" event={"ID":"0aa4eda0-a00d-4a78-8378-9baea6549157","Type":"ContainerDied","Data":"8b35c8137b296f34d36d7b917eca3d536add1ef6c90871a4f4d31b5caaf4f34d"} Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.563056 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" event={"ID":"0aa4eda0-a00d-4a78-8378-9baea6549157","Type":"ContainerStarted","Data":"ba1b27812aece12e4c51839c0f045262d8de230f85db243aa454dc5113c03df8"} Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.731052 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mpvpt" podStartSLOduration=2.731033842 podStartE2EDuration="2.731033842s" podCreationTimestamp="2026-03-13 12:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:23.712015565 +0000 UTC m=+1290.991669022" watchObservedRunningTime="2026-03-13 12:08:23.731033842 +0000 UTC m=+1291.010687289" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.759187 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-rfc95"] Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.766702 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-rfc95"] Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.901610 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:08:23 crc kubenswrapper[4786]: E0313 12:08:23.902489 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4145a870-de38-47e9-a132-56500eb117fc" 
containerName="init" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.902577 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4145a870-de38-47e9-a132-56500eb117fc" containerName="init" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.902854 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4145a870-de38-47e9-a132-56500eb117fc" containerName="init" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.919829 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.921508 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.923529 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.923765 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.923909 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-md55j" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.923928 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 12:08:23 crc kubenswrapper[4786]: E0313 12:08:23.926821 4786 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 13 12:08:23 crc kubenswrapper[4786]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/0aa4eda0-a00d-4a78-8378-9baea6549157/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 12:08:23 crc kubenswrapper[4786]: > podSandboxID="ba1b27812aece12e4c51839c0f045262d8de230f85db243aa454dc5113c03df8" Mar 13 12:08:23 crc kubenswrapper[4786]: E0313 12:08:23.927028 4786 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 12:08:23 crc kubenswrapper[4786]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h5c8h5fbh5ddh5c5h666hbch5f5h66fh68fh87hfdh699h84hcdh589h64dh76h9ch5cfh5f8h56bh89h67h5fh56bhf6h654h556hch9dh657q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jj5qh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7988f9db49-kbjq9_openstack(0aa4eda0-a00d-4a78-8378-9baea6549157): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/0aa4eda0-a00d-4a78-8378-9baea6549157/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 12:08:23 crc kubenswrapper[4786]: > logger="UnhandledError" Mar 13 12:08:23 crc kubenswrapper[4786]: E0313 12:08:23.928667 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/0aa4eda0-a00d-4a78-8378-9baea6549157/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" podUID="0aa4eda0-a00d-4a78-8378-9baea6549157" Mar 13 12:08:23 crc kubenswrapper[4786]: I0313 12:08:23.945497 4786 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.031208 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-cache\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.031247 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8fnd\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-kube-api-access-g8fnd\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.031267 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.031312 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acba774d-de43-4651-a5f0-95875154afad-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.031350 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" 
Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.031394 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-lock\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132009 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-config\") pod \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132073 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-sb\") pod \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132104 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-dns-svc\") pod \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132250 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-nb\") pod \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132278 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h52gx\" (UniqueName: 
\"kubernetes.io/projected/84d04c9e-4fd8-4b66-9481-bc2f3c774887-kube-api-access-h52gx\") pod \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\" (UID: \"84d04c9e-4fd8-4b66-9481-bc2f3c774887\") " Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132539 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-lock\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132588 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-cache\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132605 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8fnd\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-kube-api-access-g8fnd\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132624 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 
12:08:24.132667 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acba774d-de43-4651-a5f0-95875154afad-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.132994 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.133192 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-lock\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.133826 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-cache\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: E0313 12:08:24.133963 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:08:24 crc kubenswrapper[4786]: E0313 12:08:24.133988 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:08:24 crc kubenswrapper[4786]: E0313 12:08:24.134033 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift 
podName:acba774d-de43-4651-a5f0-95875154afad nodeName:}" failed. No retries permitted until 2026-03-13 12:08:24.634015176 +0000 UTC m=+1291.913668623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift") pod "swift-storage-0" (UID: "acba774d-de43-4651-a5f0-95875154afad") : configmap "swift-ring-files" not found Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.136632 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d04c9e-4fd8-4b66-9481-bc2f3c774887-kube-api-access-h52gx" (OuterVolumeSpecName: "kube-api-access-h52gx") pod "84d04c9e-4fd8-4b66-9481-bc2f3c774887" (UID: "84d04c9e-4fd8-4b66-9481-bc2f3c774887"). InnerVolumeSpecName "kube-api-access-h52gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.141712 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acba774d-de43-4651-a5f0-95875154afad-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.151811 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8fnd\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-kube-api-access-g8fnd\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.156459 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-config" (OuterVolumeSpecName: "config") pod "84d04c9e-4fd8-4b66-9481-bc2f3c774887" (UID: "84d04c9e-4fd8-4b66-9481-bc2f3c774887"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.156560 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84d04c9e-4fd8-4b66-9481-bc2f3c774887" (UID: "84d04c9e-4fd8-4b66-9481-bc2f3c774887"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.162170 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84d04c9e-4fd8-4b66-9481-bc2f3c774887" (UID: "84d04c9e-4fd8-4b66-9481-bc2f3c774887"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.164362 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84d04c9e-4fd8-4b66-9481-bc2f3c774887" (UID: "84d04c9e-4fd8-4b66-9481-bc2f3c774887"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.173616 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.234786 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.234826 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.234840 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.234852 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d04c9e-4fd8-4b66-9481-bc2f3c774887-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.234866 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h52gx\" (UniqueName: \"kubernetes.io/projected/84d04c9e-4fd8-4b66-9481-bc2f3c774887-kube-api-access-h52gx\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.570527 4786 generic.go:334] "Generic (PLEG): container finished" podID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerID="76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542" exitCode=0 Mar 13 12:08:24 crc 
kubenswrapper[4786]: I0313 12:08:24.570583 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" event={"ID":"5369266d-f8f8-4667-a8dd-0f316e959fc0","Type":"ContainerDied","Data":"76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542"} Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.570930 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" event={"ID":"5369266d-f8f8-4667-a8dd-0f316e959fc0","Type":"ContainerStarted","Data":"2ecf1a8cb1d55cf9fbf520fc70ccfda3d8c91d42d13c46d079cc7a3973484a5e"} Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.577106 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-brfln" event={"ID":"84d04c9e-4fd8-4b66-9481-bc2f3c774887","Type":"ContainerDied","Data":"08b3a5964d2a3aa3d54e65bf04aa7a9fac4fa35617990bbbb03b57b4ce05e2df"} Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.577176 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-brfln" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.577322 4786 scope.go:117] "RemoveContainer" containerID="934afd95f50bea854d57a332a60580c311a92539f6712f86f67c8e026b376ecf" Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.640907 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:24 crc kubenswrapper[4786]: E0313 12:08:24.641401 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:08:24 crc kubenswrapper[4786]: E0313 12:08:24.641428 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:08:24 crc kubenswrapper[4786]: E0313 12:08:24.641477 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift podName:acba774d-de43-4651-a5f0-95875154afad nodeName:}" failed. No retries permitted until 2026-03-13 12:08:25.641457811 +0000 UTC m=+1292.921111338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift") pod "swift-storage-0" (UID: "acba774d-de43-4651-a5f0-95875154afad") : configmap "swift-ring-files" not found Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.672858 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-brfln"] Mar 13 12:08:24 crc kubenswrapper[4786]: I0313 12:08:24.677996 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-brfln"] Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.451566 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4145a870-de38-47e9-a132-56500eb117fc" path="/var/lib/kubelet/pods/4145a870-de38-47e9-a132-56500eb117fc/volumes" Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.452492 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d04c9e-4fd8-4b66-9481-bc2f3c774887" path="/var/lib/kubelet/pods/84d04c9e-4fd8-4b66-9481-bc2f3c774887/volumes" Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.586337 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b25a4cb-7b76-4863-9085-67f99d81f569","Type":"ContainerStarted","Data":"9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc"} Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.586388 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b25a4cb-7b76-4863-9085-67f99d81f569","Type":"ContainerStarted","Data":"73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244"} Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.587204 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.590244 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" event={"ID":"0aa4eda0-a00d-4a78-8378-9baea6549157","Type":"ContainerStarted","Data":"c55445a7cc8b5160a262eabf814182787d563716174bd06c69f6e163f1a1b3dd"} Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.590700 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.592897 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" event={"ID":"5369266d-f8f8-4667-a8dd-0f316e959fc0","Type":"ContainerStarted","Data":"edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b"} Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.593105 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.613787 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.8330310880000003 podStartE2EDuration="4.61377038s" podCreationTimestamp="2026-03-13 12:08:21 +0000 UTC" firstStartedPulling="2026-03-13 12:08:22.775743057 +0000 UTC m=+1290.055396504" lastFinishedPulling="2026-03-13 12:08:24.556482349 +0000 UTC m=+1291.836135796" observedRunningTime="2026-03-13 12:08:25.610073841 +0000 UTC m=+1292.889727298" watchObservedRunningTime="2026-03-13 12:08:25.61377038 +0000 UTC m=+1292.893423827" Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.637778 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" podStartSLOduration=4.637755719 podStartE2EDuration="4.637755719s" podCreationTimestamp="2026-03-13 12:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:25.631116192 +0000 UTC m=+1292.910769649" 
watchObservedRunningTime="2026-03-13 12:08:25.637755719 +0000 UTC m=+1292.917409166" Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.653761 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" podStartSLOduration=3.653742634 podStartE2EDuration="3.653742634s" podCreationTimestamp="2026-03-13 12:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:25.651215267 +0000 UTC m=+1292.930868734" watchObservedRunningTime="2026-03-13 12:08:25.653742634 +0000 UTC m=+1292.933396101" Mar 13 12:08:25 crc kubenswrapper[4786]: I0313 12:08:25.661115 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:25 crc kubenswrapper[4786]: E0313 12:08:25.661324 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:08:25 crc kubenswrapper[4786]: E0313 12:08:25.661360 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:08:25 crc kubenswrapper[4786]: E0313 12:08:25.661423 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift podName:acba774d-de43-4651-a5f0-95875154afad nodeName:}" failed. No retries permitted until 2026-03-13 12:08:27.661400669 +0000 UTC m=+1294.941054126 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift") pod "swift-storage-0" (UID: "acba774d-de43-4651-a5f0-95875154afad") : configmap "swift-ring-files" not found Mar 13 12:08:26 crc kubenswrapper[4786]: I0313 12:08:26.603211 4786 generic.go:334] "Generic (PLEG): container finished" podID="258afae9-f870-4f49-8102-3f987302da26" containerID="22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909" exitCode=0 Mar 13 12:08:26 crc kubenswrapper[4786]: I0313 12:08:26.603332 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"258afae9-f870-4f49-8102-3f987302da26","Type":"ContainerDied","Data":"22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909"} Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.631029 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"258afae9-f870-4f49-8102-3f987302da26","Type":"ContainerStarted","Data":"71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f"} Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.679286 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371996.17551 podStartE2EDuration="40.679264466s" podCreationTimestamp="2026-03-13 12:07:47 +0000 UTC" firstStartedPulling="2026-03-13 12:07:49.386359013 +0000 UTC m=+1256.666012460" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:27.658510413 +0000 UTC m=+1294.938163900" watchObservedRunningTime="2026-03-13 12:08:27.679264466 +0000 UTC m=+1294.958917933" Mar 13 12:08:27 crc kubenswrapper[4786]: E0313 12:08:27.701664 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:08:27 crc kubenswrapper[4786]: E0313 12:08:27.701717 4786 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:08:27 crc kubenswrapper[4786]: E0313 12:08:27.701813 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift podName:acba774d-de43-4651-a5f0-95875154afad nodeName:}" failed. No retries permitted until 2026-03-13 12:08:31.701780456 +0000 UTC m=+1298.981433953 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift") pod "swift-storage-0" (UID: "acba774d-de43-4651-a5f0-95875154afad") : configmap "swift-ring-files" not found Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.702378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.793947 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-lrq7b"] Mar 13 12:08:27 crc kubenswrapper[4786]: E0313 12:08:27.794810 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d04c9e-4fd8-4b66-9481-bc2f3c774887" containerName="init" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.794831 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d04c9e-4fd8-4b66-9481-bc2f3c774887" containerName="init" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.795027 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d04c9e-4fd8-4b66-9481-bc2f3c774887" containerName="init" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.795581 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.798350 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.798564 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.800989 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.811680 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lrq7b"] Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.906997 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-ring-data-devices\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.907057 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4fd3434b-b358-4463-a081-511dd7a3469d-etc-swift\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.907173 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-swiftconf\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.907202 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-dispersionconf\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.907767 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5cmq\" (UniqueName: \"kubernetes.io/projected/4fd3434b-b358-4463-a081-511dd7a3469d-kube-api-access-j5cmq\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.907917 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-scripts\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:27 crc kubenswrapper[4786]: I0313 12:08:27.908049 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-combined-ca-bundle\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.009111 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-swiftconf\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.009169 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-dispersionconf\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.009240 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5cmq\" (UniqueName: \"kubernetes.io/projected/4fd3434b-b358-4463-a081-511dd7a3469d-kube-api-access-j5cmq\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.009266 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-scripts\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.009294 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-combined-ca-bundle\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.009343 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-ring-data-devices\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.009369 4786 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4fd3434b-b358-4463-a081-511dd7a3469d-etc-swift\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.009944 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4fd3434b-b358-4463-a081-511dd7a3469d-etc-swift\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.010220 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-ring-data-devices\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.010449 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-scripts\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.014049 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-dispersionconf\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.022421 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-swiftconf\") pod \"swift-ring-rebalance-lrq7b\" 
(UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.022599 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-combined-ca-bundle\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.030084 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5cmq\" (UniqueName: \"kubernetes.io/projected/4fd3434b-b358-4463-a081-511dd7a3469d-kube-api-access-j5cmq\") pod \"swift-ring-rebalance-lrq7b\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.116014 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:28 crc kubenswrapper[4786]: W0313 12:08:28.556404 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fd3434b_b358_4463_a081_511dd7a3469d.slice/crio-3aeb023e741a7ee941e31e761e3d59989cb76e5e491edbbee89a4c98dda417f3 WatchSource:0}: Error finding container 3aeb023e741a7ee941e31e761e3d59989cb76e5e491edbbee89a4c98dda417f3: Status 404 returned error can't find the container with id 3aeb023e741a7ee941e31e761e3d59989cb76e5e491edbbee89a4c98dda417f3 Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.564485 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lrq7b"] Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.639916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lrq7b" 
event={"ID":"4fd3434b-b358-4463-a081-511dd7a3469d","Type":"ContainerStarted","Data":"3aeb023e741a7ee941e31e761e3d59989cb76e5e491edbbee89a4c98dda417f3"} Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.849215 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.849269 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.913257 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-j2d2d"] Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.915156 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.918737 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 12:08:28 crc kubenswrapper[4786]: I0313 12:08:28.922289 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j2d2d"] Mar 13 12:08:29 crc kubenswrapper[4786]: I0313 12:08:29.025496 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zf7f\" (UniqueName: \"kubernetes.io/projected/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-kube-api-access-2zf7f\") pod \"root-account-create-update-j2d2d\" (UID: \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\") " pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:29 crc kubenswrapper[4786]: I0313 12:08:29.025601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-operator-scripts\") pod \"root-account-create-update-j2d2d\" (UID: \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\") " 
pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:29 crc kubenswrapper[4786]: I0313 12:08:29.127563 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zf7f\" (UniqueName: \"kubernetes.io/projected/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-kube-api-access-2zf7f\") pod \"root-account-create-update-j2d2d\" (UID: \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\") " pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:29 crc kubenswrapper[4786]: I0313 12:08:29.127646 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-operator-scripts\") pod \"root-account-create-update-j2d2d\" (UID: \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\") " pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:29 crc kubenswrapper[4786]: I0313 12:08:29.128671 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-operator-scripts\") pod \"root-account-create-update-j2d2d\" (UID: \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\") " pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:29 crc kubenswrapper[4786]: I0313 12:08:29.155017 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zf7f\" (UniqueName: \"kubernetes.io/projected/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-kube-api-access-2zf7f\") pod \"root-account-create-update-j2d2d\" (UID: \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\") " pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:29 crc kubenswrapper[4786]: I0313 12:08:29.249676 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:29 crc kubenswrapper[4786]: I0313 12:08:29.689377 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j2d2d"] Mar 13 12:08:30 crc kubenswrapper[4786]: I0313 12:08:30.656926 4786 generic.go:334] "Generic (PLEG): container finished" podID="564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0" containerID="c29161d84a60eeddfde50252c6d91b69df94664233f5fe0d9fba02e17dc0a1ec" exitCode=0 Mar 13 12:08:30 crc kubenswrapper[4786]: I0313 12:08:30.656966 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j2d2d" event={"ID":"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0","Type":"ContainerDied","Data":"c29161d84a60eeddfde50252c6d91b69df94664233f5fe0d9fba02e17dc0a1ec"} Mar 13 12:08:30 crc kubenswrapper[4786]: I0313 12:08:30.657204 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j2d2d" event={"ID":"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0","Type":"ContainerStarted","Data":"9d7fc70c40eab7f891487323d2a425d0e05f42442df7e9bf8a09bb7171536d04"} Mar 13 12:08:31 crc kubenswrapper[4786]: I0313 12:08:31.782390 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:31 crc kubenswrapper[4786]: E0313 12:08:31.782749 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:08:31 crc kubenswrapper[4786]: E0313 12:08:31.782771 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:08:31 crc kubenswrapper[4786]: E0313 12:08:31.782820 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift podName:acba774d-de43-4651-a5f0-95875154afad nodeName:}" failed. No retries permitted until 2026-03-13 12:08:39.782803677 +0000 UTC m=+1307.062457124 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift") pod "swift-storage-0" (UID: "acba774d-de43-4651-a5f0-95875154afad") : configmap "swift-ring-files" not found Mar 13 12:08:31 crc kubenswrapper[4786]: E0313 12:08:31.938352 4786 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.151:35468->38.102.83.151:42535: read tcp 38.102.83.151:35468->38.102.83.151:42535: read: connection reset by peer Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.289050 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.470952 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.599724 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-operator-scripts\") pod \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\" (UID: \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\") " Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.599922 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zf7f\" (UniqueName: \"kubernetes.io/projected/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-kube-api-access-2zf7f\") pod \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\" (UID: \"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0\") " Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.600202 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0" (UID: "564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.600335 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.606949 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-kube-api-access-2zf7f" (OuterVolumeSpecName: "kube-api-access-2zf7f") pod "564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0" (UID: "564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0"). InnerVolumeSpecName "kube-api-access-2zf7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.674824 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j2d2d" event={"ID":"564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0","Type":"ContainerDied","Data":"9d7fc70c40eab7f891487323d2a425d0e05f42442df7e9bf8a09bb7171536d04"} Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.674865 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d7fc70c40eab7f891487323d2a425d0e05f42442df7e9bf8a09bb7171536d04" Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.674937 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j2d2d" Mar 13 12:08:32 crc kubenswrapper[4786]: I0313 12:08:32.701985 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zf7f\" (UniqueName: \"kubernetes.io/projected/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0-kube-api-access-2zf7f\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:33 crc kubenswrapper[4786]: I0313 12:08:33.070933 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:08:33 crc kubenswrapper[4786]: I0313 12:08:33.183025 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-kbjq9"] Mar 13 12:08:33 crc kubenswrapper[4786]: I0313 12:08:33.183519 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" podUID="0aa4eda0-a00d-4a78-8378-9baea6549157" containerName="dnsmasq-dns" containerID="cri-o://c55445a7cc8b5160a262eabf814182787d563716174bd06c69f6e163f1a1b3dd" gracePeriod=10 Mar 13 12:08:33 crc kubenswrapper[4786]: I0313 12:08:33.684694 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lrq7b" 
event={"ID":"4fd3434b-b358-4463-a081-511dd7a3469d","Type":"ContainerStarted","Data":"c47dabf207c6254379ea6b9544893c6b3c97efea63726763b4c7bf4d8a82c764"} Mar 13 12:08:33 crc kubenswrapper[4786]: I0313 12:08:33.726472 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-lrq7b" podStartSLOduration=2.568054015 podStartE2EDuration="6.726453018s" podCreationTimestamp="2026-03-13 12:08:27 +0000 UTC" firstStartedPulling="2026-03-13 12:08:28.55942044 +0000 UTC m=+1295.839073887" lastFinishedPulling="2026-03-13 12:08:32.717819443 +0000 UTC m=+1299.997472890" observedRunningTime="2026-03-13 12:08:33.7220024 +0000 UTC m=+1301.001655867" watchObservedRunningTime="2026-03-13 12:08:33.726453018 +0000 UTC m=+1301.006106465" Mar 13 12:08:34 crc kubenswrapper[4786]: I0313 12:08:34.697785 4786 generic.go:334] "Generic (PLEG): container finished" podID="0aa4eda0-a00d-4a78-8378-9baea6549157" containerID="c55445a7cc8b5160a262eabf814182787d563716174bd06c69f6e163f1a1b3dd" exitCode=0 Mar 13 12:08:34 crc kubenswrapper[4786]: I0313 12:08:34.697866 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" event={"ID":"0aa4eda0-a00d-4a78-8378-9baea6549157","Type":"ContainerDied","Data":"c55445a7cc8b5160a262eabf814182787d563716174bd06c69f6e163f1a1b3dd"} Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.137020 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.241492 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-ovsdbserver-sb\") pod \"0aa4eda0-a00d-4a78-8378-9baea6549157\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.241603 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj5qh\" (UniqueName: \"kubernetes.io/projected/0aa4eda0-a00d-4a78-8378-9baea6549157-kube-api-access-jj5qh\") pod \"0aa4eda0-a00d-4a78-8378-9baea6549157\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.241631 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-dns-svc\") pod \"0aa4eda0-a00d-4a78-8378-9baea6549157\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.241659 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-config\") pod \"0aa4eda0-a00d-4a78-8378-9baea6549157\" (UID: \"0aa4eda0-a00d-4a78-8378-9baea6549157\") " Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.248541 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa4eda0-a00d-4a78-8378-9baea6549157-kube-api-access-jj5qh" (OuterVolumeSpecName: "kube-api-access-jj5qh") pod "0aa4eda0-a00d-4a78-8378-9baea6549157" (UID: "0aa4eda0-a00d-4a78-8378-9baea6549157"). InnerVolumeSpecName "kube-api-access-jj5qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.279122 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0aa4eda0-a00d-4a78-8378-9baea6549157" (UID: "0aa4eda0-a00d-4a78-8378-9baea6549157"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.295310 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-config" (OuterVolumeSpecName: "config") pod "0aa4eda0-a00d-4a78-8378-9baea6549157" (UID: "0aa4eda0-a00d-4a78-8378-9baea6549157"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.310404 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0aa4eda0-a00d-4a78-8378-9baea6549157" (UID: "0aa4eda0-a00d-4a78-8378-9baea6549157"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.343337 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.343392 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj5qh\" (UniqueName: \"kubernetes.io/projected/0aa4eda0-a00d-4a78-8378-9baea6549157-kube-api-access-jj5qh\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.343413 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.343430 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa4eda0-a00d-4a78-8378-9baea6549157-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.707138 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" event={"ID":"0aa4eda0-a00d-4a78-8378-9baea6549157","Type":"ContainerDied","Data":"ba1b27812aece12e4c51839c0f045262d8de230f85db243aa454dc5113c03df8"} Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.707200 4786 scope.go:117] "RemoveContainer" containerID="c55445a7cc8b5160a262eabf814182787d563716174bd06c69f6e163f1a1b3dd" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.707286 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-kbjq9" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.748954 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-kbjq9"] Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.754043 4786 scope.go:117] "RemoveContainer" containerID="8b35c8137b296f34d36d7b917eca3d536add1ef6c90871a4f4d31b5caaf4f34d" Mar 13 12:08:35 crc kubenswrapper[4786]: I0313 12:08:35.762909 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-kbjq9"] Mar 13 12:08:37 crc kubenswrapper[4786]: I0313 12:08:37.449320 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa4eda0-a00d-4a78-8378-9baea6549157" path="/var/lib/kubelet/pods/0aa4eda0-a00d-4a78-8378-9baea6549157/volumes" Mar 13 12:08:37 crc kubenswrapper[4786]: I0313 12:08:37.450301 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 12:08:37 crc kubenswrapper[4786]: I0313 12:08:37.529356 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="258afae9-f870-4f49-8102-3f987302da26" containerName="galera" probeResult="failure" output=< Mar 13 12:08:37 crc kubenswrapper[4786]: wsrep_local_state_comment (Joined) differs from Synced Mar 13 12:08:37 crc kubenswrapper[4786]: > Mar 13 12:08:38 crc kubenswrapper[4786]: I0313 12:08:38.926701 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 12:08:39 crc kubenswrapper[4786]: I0313 12:08:39.824614 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:39 crc kubenswrapper[4786]: E0313 12:08:39.824801 4786 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:08:39 crc kubenswrapper[4786]: E0313 12:08:39.824835 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:08:39 crc kubenswrapper[4786]: E0313 12:08:39.824923 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift podName:acba774d-de43-4651-a5f0-95875154afad nodeName:}" failed. No retries permitted until 2026-03-13 12:08:55.824901065 +0000 UTC m=+1323.104554522 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift") pod "swift-storage-0" (UID: "acba774d-de43-4651-a5f0-95875154afad") : configmap "swift-ring-files" not found Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.497872 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bsjm4"] Mar 13 12:08:41 crc kubenswrapper[4786]: E0313 12:08:41.498451 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0" containerName="mariadb-account-create-update" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.498465 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0" containerName="mariadb-account-create-update" Mar 13 12:08:41 crc kubenswrapper[4786]: E0313 12:08:41.498488 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa4eda0-a00d-4a78-8378-9baea6549157" containerName="init" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.498494 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa4eda0-a00d-4a78-8378-9baea6549157" containerName="init" Mar 13 12:08:41 crc kubenswrapper[4786]: E0313 12:08:41.498513 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0aa4eda0-a00d-4a78-8378-9baea6549157" containerName="dnsmasq-dns" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.498522 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa4eda0-a00d-4a78-8378-9baea6549157" containerName="dnsmasq-dns" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.498714 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa4eda0-a00d-4a78-8378-9baea6549157" containerName="dnsmasq-dns" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.498730 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0" containerName="mariadb-account-create-update" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.499388 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.507150 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bsjm4"] Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.603297 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5794-account-create-update-5vccb"] Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.604462 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.605757 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.616393 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5794-account-create-update-5vccb"] Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.656198 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87da50ad-41f1-4208-b36c-d874ac2250c7-operator-scripts\") pod \"keystone-db-create-bsjm4\" (UID: \"87da50ad-41f1-4208-b36c-d874ac2250c7\") " pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.656530 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrkc8\" (UniqueName: \"kubernetes.io/projected/87da50ad-41f1-4208-b36c-d874ac2250c7-kube-api-access-lrkc8\") pod \"keystone-db-create-bsjm4\" (UID: \"87da50ad-41f1-4208-b36c-d874ac2250c7\") " pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.709119 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-tjh4s"] Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.710256 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.720041 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tjh4s"] Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.758022 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87da50ad-41f1-4208-b36c-d874ac2250c7-operator-scripts\") pod \"keystone-db-create-bsjm4\" (UID: \"87da50ad-41f1-4208-b36c-d874ac2250c7\") " pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.758078 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc9776-7662-41d9-93ac-38c5b98709ab-operator-scripts\") pod \"keystone-5794-account-create-update-5vccb\" (UID: \"ffbc9776-7662-41d9-93ac-38c5b98709ab\") " pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.758184 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsbkr\" (UniqueName: \"kubernetes.io/projected/ffbc9776-7662-41d9-93ac-38c5b98709ab-kube-api-access-nsbkr\") pod \"keystone-5794-account-create-update-5vccb\" (UID: \"ffbc9776-7662-41d9-93ac-38c5b98709ab\") " pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.758214 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrkc8\" (UniqueName: \"kubernetes.io/projected/87da50ad-41f1-4208-b36c-d874ac2250c7-kube-api-access-lrkc8\") pod \"keystone-db-create-bsjm4\" (UID: \"87da50ad-41f1-4208-b36c-d874ac2250c7\") " pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.759018 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87da50ad-41f1-4208-b36c-d874ac2250c7-operator-scripts\") pod \"keystone-db-create-bsjm4\" (UID: \"87da50ad-41f1-4208-b36c-d874ac2250c7\") " pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.778212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrkc8\" (UniqueName: \"kubernetes.io/projected/87da50ad-41f1-4208-b36c-d874ac2250c7-kube-api-access-lrkc8\") pod \"keystone-db-create-bsjm4\" (UID: \"87da50ad-41f1-4208-b36c-d874ac2250c7\") " pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.804200 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d64c-account-create-update-zbfj6"] Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.805631 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.808801 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.814859 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d64c-account-create-update-zbfj6"] Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.817244 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.860024 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ce55b0-1082-49c8-b648-92425775ed24-operator-scripts\") pod \"placement-db-create-tjh4s\" (UID: \"42ce55b0-1082-49c8-b648-92425775ed24\") " pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.860234 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsbkr\" (UniqueName: \"kubernetes.io/projected/ffbc9776-7662-41d9-93ac-38c5b98709ab-kube-api-access-nsbkr\") pod \"keystone-5794-account-create-update-5vccb\" (UID: \"ffbc9776-7662-41d9-93ac-38c5b98709ab\") " pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.860484 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k555\" (UniqueName: \"kubernetes.io/projected/42ce55b0-1082-49c8-b648-92425775ed24-kube-api-access-6k555\") pod \"placement-db-create-tjh4s\" (UID: \"42ce55b0-1082-49c8-b648-92425775ed24\") " pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.860556 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc9776-7662-41d9-93ac-38c5b98709ab-operator-scripts\") pod \"keystone-5794-account-create-update-5vccb\" (UID: \"ffbc9776-7662-41d9-93ac-38c5b98709ab\") " pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.861555 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc9776-7662-41d9-93ac-38c5b98709ab-operator-scripts\") pod 
\"keystone-5794-account-create-update-5vccb\" (UID: \"ffbc9776-7662-41d9-93ac-38c5b98709ab\") " pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.881490 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsbkr\" (UniqueName: \"kubernetes.io/projected/ffbc9776-7662-41d9-93ac-38c5b98709ab-kube-api-access-nsbkr\") pod \"keystone-5794-account-create-update-5vccb\" (UID: \"ffbc9776-7662-41d9-93ac-38c5b98709ab\") " pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.961577 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k555\" (UniqueName: \"kubernetes.io/projected/42ce55b0-1082-49c8-b648-92425775ed24-kube-api-access-6k555\") pod \"placement-db-create-tjh4s\" (UID: \"42ce55b0-1082-49c8-b648-92425775ed24\") " pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.961935 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-operator-scripts\") pod \"placement-d64c-account-create-update-zbfj6\" (UID: \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\") " pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.961968 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2njk\" (UniqueName: \"kubernetes.io/projected/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-kube-api-access-f2njk\") pod \"placement-d64c-account-create-update-zbfj6\" (UID: \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\") " pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.961998 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ce55b0-1082-49c8-b648-92425775ed24-operator-scripts\") pod \"placement-db-create-tjh4s\" (UID: \"42ce55b0-1082-49c8-b648-92425775ed24\") " pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.962805 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ce55b0-1082-49c8-b648-92425775ed24-operator-scripts\") pod \"placement-db-create-tjh4s\" (UID: \"42ce55b0-1082-49c8-b648-92425775ed24\") " pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.975327 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:41 crc kubenswrapper[4786]: I0313 12:08:41.979068 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k555\" (UniqueName: \"kubernetes.io/projected/42ce55b0-1082-49c8-b648-92425775ed24-kube-api-access-6k555\") pod \"placement-db-create-tjh4s\" (UID: \"42ce55b0-1082-49c8-b648-92425775ed24\") " pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:42 crc kubenswrapper[4786]: I0313 12:08:42.024754 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:42 crc kubenswrapper[4786]: I0313 12:08:42.063763 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-operator-scripts\") pod \"placement-d64c-account-create-update-zbfj6\" (UID: \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\") " pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:42 crc kubenswrapper[4786]: I0313 12:08:42.063833 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2njk\" (UniqueName: \"kubernetes.io/projected/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-kube-api-access-f2njk\") pod \"placement-d64c-account-create-update-zbfj6\" (UID: \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\") " pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:42 crc kubenswrapper[4786]: I0313 12:08:42.065408 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-operator-scripts\") pod \"placement-d64c-account-create-update-zbfj6\" (UID: \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\") " pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:42 crc kubenswrapper[4786]: I0313 12:08:42.082388 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2njk\" (UniqueName: \"kubernetes.io/projected/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-kube-api-access-f2njk\") pod \"placement-d64c-account-create-update-zbfj6\" (UID: \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\") " pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:42 crc kubenswrapper[4786]: I0313 12:08:42.126979 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:42 crc kubenswrapper[4786]: I0313 12:08:42.246697 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 12:08:42 crc kubenswrapper[4786]: I0313 12:08:42.262466 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bsjm4"] Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.406825 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5794-account-create-update-5vccb"] Mar 13 12:08:43 crc kubenswrapper[4786]: W0313 12:08:42.412127 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffbc9776_7662_41d9_93ac_38c5b98709ab.slice/crio-00fa36bcf118f74dc28c36600ba5d001444cbb1074e7377be6593738bfa85e0a WatchSource:0}: Error finding container 00fa36bcf118f74dc28c36600ba5d001444cbb1074e7377be6593738bfa85e0a: Status 404 returned error can't find the container with id 00fa36bcf118f74dc28c36600ba5d001444cbb1074e7377be6593738bfa85e0a Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.522377 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tjh4s"] Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.770921 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tjh4s" event={"ID":"42ce55b0-1082-49c8-b648-92425775ed24","Type":"ContainerStarted","Data":"a95a3ffb3a7cfd95fc50a3707e38a7f96e07f8e41e18b54da33b225a494389a7"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.770961 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tjh4s" event={"ID":"42ce55b0-1082-49c8-b648-92425775ed24","Type":"ContainerStarted","Data":"934905257e954c1b207938a1681f406d83f6b1e1188d934bd85fc6eed09101b1"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.772110 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-5794-account-create-update-5vccb" event={"ID":"ffbc9776-7662-41d9-93ac-38c5b98709ab","Type":"ContainerStarted","Data":"3e8ce702d9966b96d3a99192c9116bb2858d49fd20d91333e6186efe37b8fe1e"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.772149 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5794-account-create-update-5vccb" event={"ID":"ffbc9776-7662-41d9-93ac-38c5b98709ab","Type":"ContainerStarted","Data":"00fa36bcf118f74dc28c36600ba5d001444cbb1074e7377be6593738bfa85e0a"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.789413 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bsjm4" event={"ID":"87da50ad-41f1-4208-b36c-d874ac2250c7","Type":"ContainerStarted","Data":"a80da01ce4b12b8ab209d5616c1d6a41c1d72b0e70b32ddcb0d4f8b3a0f1fd69"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.789453 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bsjm4" event={"ID":"87da50ad-41f1-4208-b36c-d874ac2250c7","Type":"ContainerStarted","Data":"dabcb6fe3f7d3a3070f5b7d1fc4f34f064275b8528b6b0e026c3b81a3b025e3a"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.806844 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-tjh4s" podStartSLOduration=1.8068223209999998 podStartE2EDuration="1.806822321s" podCreationTimestamp="2026-03-13 12:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:42.803642737 +0000 UTC m=+1310.083296204" watchObservedRunningTime="2026-03-13 12:08:42.806822321 +0000 UTC m=+1310.086475778" Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.825627 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5794-account-create-update-5vccb" podStartSLOduration=1.825605792 
podStartE2EDuration="1.825605792s" podCreationTimestamp="2026-03-13 12:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:42.821226094 +0000 UTC m=+1310.100879541" watchObservedRunningTime="2026-03-13 12:08:42.825605792 +0000 UTC m=+1310.105259249" Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:42.845670 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-bsjm4" podStartSLOduration=1.8456542759999999 podStartE2EDuration="1.845654276s" podCreationTimestamp="2026-03-13 12:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:42.843834087 +0000 UTC m=+1310.123487534" watchObservedRunningTime="2026-03-13 12:08:42.845654276 +0000 UTC m=+1310.125307723" Mar 13 12:08:43 crc kubenswrapper[4786]: W0313 12:08:43.610013 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ca1e192_7715_4bd6_b4b3_d6e6912b319c.slice/crio-11ed255f9db0be755c808cd93f2fe43448a37bafdfe8eb279d7fda8d84e34e27 WatchSource:0}: Error finding container 11ed255f9db0be755c808cd93f2fe43448a37bafdfe8eb279d7fda8d84e34e27: Status 404 returned error can't find the container with id 11ed255f9db0be755c808cd93f2fe43448a37bafdfe8eb279d7fda8d84e34e27 Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.619269 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d64c-account-create-update-zbfj6"] Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.800351 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fd3434b-b358-4463-a081-511dd7a3469d" containerID="c47dabf207c6254379ea6b9544893c6b3c97efea63726763b4c7bf4d8a82c764" exitCode=0 Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.800383 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-ring-rebalance-lrq7b" event={"ID":"4fd3434b-b358-4463-a081-511dd7a3469d","Type":"ContainerDied","Data":"c47dabf207c6254379ea6b9544893c6b3c97efea63726763b4c7bf4d8a82c764"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.804367 4786 generic.go:334] "Generic (PLEG): container finished" podID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerID="aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d" exitCode=0 Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.804439 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b196d91-2a1f-4ee5-81d5-0133f2917cc5","Type":"ContainerDied","Data":"aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.806015 4786 generic.go:334] "Generic (PLEG): container finished" podID="87da50ad-41f1-4208-b36c-d874ac2250c7" containerID="a80da01ce4b12b8ab209d5616c1d6a41c1d72b0e70b32ddcb0d4f8b3a0f1fd69" exitCode=0 Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.806131 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bsjm4" event={"ID":"87da50ad-41f1-4208-b36c-d874ac2250c7","Type":"ContainerDied","Data":"a80da01ce4b12b8ab209d5616c1d6a41c1d72b0e70b32ddcb0d4f8b3a0f1fd69"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.807292 4786 generic.go:334] "Generic (PLEG): container finished" podID="42ce55b0-1082-49c8-b648-92425775ed24" containerID="a95a3ffb3a7cfd95fc50a3707e38a7f96e07f8e41e18b54da33b225a494389a7" exitCode=0 Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.807362 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tjh4s" event={"ID":"42ce55b0-1082-49c8-b648-92425775ed24","Type":"ContainerDied","Data":"a95a3ffb3a7cfd95fc50a3707e38a7f96e07f8e41e18b54da33b225a494389a7"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.808737 4786 generic.go:334] "Generic 
(PLEG): container finished" podID="ffbc9776-7662-41d9-93ac-38c5b98709ab" containerID="3e8ce702d9966b96d3a99192c9116bb2858d49fd20d91333e6186efe37b8fe1e" exitCode=0 Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.808839 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5794-account-create-update-5vccb" event={"ID":"ffbc9776-7662-41d9-93ac-38c5b98709ab","Type":"ContainerDied","Data":"3e8ce702d9966b96d3a99192c9116bb2858d49fd20d91333e6186efe37b8fe1e"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.812237 4786 generic.go:334] "Generic (PLEG): container finished" podID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerID="27d6eb8401490fb55d774c4f395089b4fb75b0cc2244cbaf43b5759b74129ca6" exitCode=0 Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.812343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53fea24b-7ca8-4c0a-96d1-458ca1e877a7","Type":"ContainerDied","Data":"27d6eb8401490fb55d774c4f395089b4fb75b0cc2244cbaf43b5759b74129ca6"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.829748 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d64c-account-create-update-zbfj6" event={"ID":"3ca1e192-7715-4bd6-b4b3-d6e6912b319c","Type":"ContainerStarted","Data":"06cbe6fcaf2610a101d910170cbceb0e2a4ec74893cf6c2d3fde4fd12608b429"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.829796 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d64c-account-create-update-zbfj6" event={"ID":"3ca1e192-7715-4bd6-b4b3-d6e6912b319c","Type":"ContainerStarted","Data":"11ed255f9db0be755c808cd93f2fe43448a37bafdfe8eb279d7fda8d84e34e27"} Mar 13 12:08:43 crc kubenswrapper[4786]: I0313 12:08:43.957388 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d64c-account-create-update-zbfj6" podStartSLOduration=2.9573707369999997 podStartE2EDuration="2.957370737s" 
podCreationTimestamp="2026-03-13 12:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:43.947184976 +0000 UTC m=+1311.226838443" watchObservedRunningTime="2026-03-13 12:08:43.957370737 +0000 UTC m=+1311.237024184" Mar 13 12:08:44 crc kubenswrapper[4786]: I0313 12:08:44.840023 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b196d91-2a1f-4ee5-81d5-0133f2917cc5","Type":"ContainerStarted","Data":"6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943"} Mar 13 12:08:44 crc kubenswrapper[4786]: I0313 12:08:44.842315 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53fea24b-7ca8-4c0a-96d1-458ca1e877a7","Type":"ContainerStarted","Data":"8546d15d615043030d104f666fcccae710b91eaabc4b545097a038170b3a7dcf"} Mar 13 12:08:44 crc kubenswrapper[4786]: I0313 12:08:44.891117 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.247235276 podStartE2EDuration="59.891093978s" podCreationTimestamp="2026-03-13 12:07:45 +0000 UTC" firstStartedPulling="2026-03-13 12:07:47.825548439 +0000 UTC m=+1255.105201886" lastFinishedPulling="2026-03-13 12:08:09.469407141 +0000 UTC m=+1276.749060588" observedRunningTime="2026-03-13 12:08:44.884768179 +0000 UTC m=+1312.164421666" watchObservedRunningTime="2026-03-13 12:08:44.891093978 +0000 UTC m=+1312.170747435" Mar 13 12:08:44 crc kubenswrapper[4786]: I0313 12:08:44.917338 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371976.937456 podStartE2EDuration="59.917319707s" podCreationTimestamp="2026-03-13 12:07:45 +0000 UTC" firstStartedPulling="2026-03-13 12:07:47.721348323 +0000 UTC m=+1255.001001770" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 12:08:44.915491577 +0000 UTC m=+1312.195145044" watchObservedRunningTime="2026-03-13 12:08:44.917319707 +0000 UTC m=+1312.196973144" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.235403 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.336076 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87da50ad-41f1-4208-b36c-d874ac2250c7-operator-scripts\") pod \"87da50ad-41f1-4208-b36c-d874ac2250c7\" (UID: \"87da50ad-41f1-4208-b36c-d874ac2250c7\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.336520 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrkc8\" (UniqueName: \"kubernetes.io/projected/87da50ad-41f1-4208-b36c-d874ac2250c7-kube-api-access-lrkc8\") pod \"87da50ad-41f1-4208-b36c-d874ac2250c7\" (UID: \"87da50ad-41f1-4208-b36c-d874ac2250c7\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.336603 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87da50ad-41f1-4208-b36c-d874ac2250c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87da50ad-41f1-4208-b36c-d874ac2250c7" (UID: "87da50ad-41f1-4208-b36c-d874ac2250c7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.336898 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87da50ad-41f1-4208-b36c-d874ac2250c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.341839 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87da50ad-41f1-4208-b36c-d874ac2250c7-kube-api-access-lrkc8" (OuterVolumeSpecName: "kube-api-access-lrkc8") pod "87da50ad-41f1-4208-b36c-d874ac2250c7" (UID: "87da50ad-41f1-4208-b36c-d874ac2250c7"). InnerVolumeSpecName "kube-api-access-lrkc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.381283 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.389006 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.393656 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454026 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-swiftconf\") pod \"4fd3434b-b358-4463-a081-511dd7a3469d\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454067 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ce55b0-1082-49c8-b648-92425775ed24-operator-scripts\") pod \"42ce55b0-1082-49c8-b648-92425775ed24\" (UID: \"42ce55b0-1082-49c8-b648-92425775ed24\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4fd3434b-b358-4463-a081-511dd7a3469d-etc-swift\") pod \"4fd3434b-b358-4463-a081-511dd7a3469d\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454125 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-scripts\") pod \"4fd3434b-b358-4463-a081-511dd7a3469d\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454187 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k555\" (UniqueName: \"kubernetes.io/projected/42ce55b0-1082-49c8-b648-92425775ed24-kube-api-access-6k555\") pod \"42ce55b0-1082-49c8-b648-92425775ed24\" (UID: \"42ce55b0-1082-49c8-b648-92425775ed24\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454217 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5cmq\" 
(UniqueName: \"kubernetes.io/projected/4fd3434b-b358-4463-a081-511dd7a3469d-kube-api-access-j5cmq\") pod \"4fd3434b-b358-4463-a081-511dd7a3469d\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454372 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-ring-data-devices\") pod \"4fd3434b-b358-4463-a081-511dd7a3469d\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454412 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-combined-ca-bundle\") pod \"4fd3434b-b358-4463-a081-511dd7a3469d\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454452 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsbkr\" (UniqueName: \"kubernetes.io/projected/ffbc9776-7662-41d9-93ac-38c5b98709ab-kube-api-access-nsbkr\") pod \"ffbc9776-7662-41d9-93ac-38c5b98709ab\" (UID: \"ffbc9776-7662-41d9-93ac-38c5b98709ab\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454483 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc9776-7662-41d9-93ac-38c5b98709ab-operator-scripts\") pod \"ffbc9776-7662-41d9-93ac-38c5b98709ab\" (UID: \"ffbc9776-7662-41d9-93ac-38c5b98709ab\") " Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454537 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-dispersionconf\") pod \"4fd3434b-b358-4463-a081-511dd7a3469d\" (UID: \"4fd3434b-b358-4463-a081-511dd7a3469d\") " Mar 13 
12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454807 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ce55b0-1082-49c8-b648-92425775ed24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42ce55b0-1082-49c8-b648-92425775ed24" (UID: "42ce55b0-1082-49c8-b648-92425775ed24"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.454988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fd3434b-b358-4463-a081-511dd7a3469d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4fd3434b-b358-4463-a081-511dd7a3469d" (UID: "4fd3434b-b358-4463-a081-511dd7a3469d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.455133 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ce55b0-1082-49c8-b648-92425775ed24-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.455154 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4fd3434b-b358-4463-a081-511dd7a3469d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.455167 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrkc8\" (UniqueName: \"kubernetes.io/projected/87da50ad-41f1-4208-b36c-d874ac2250c7-kube-api-access-lrkc8\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.455493 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffbc9776-7662-41d9-93ac-38c5b98709ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ffbc9776-7662-41d9-93ac-38c5b98709ab" (UID: 
"ffbc9776-7662-41d9-93ac-38c5b98709ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.455766 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4fd3434b-b358-4463-a081-511dd7a3469d" (UID: "4fd3434b-b358-4463-a081-511dd7a3469d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.461858 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbc9776-7662-41d9-93ac-38c5b98709ab-kube-api-access-nsbkr" (OuterVolumeSpecName: "kube-api-access-nsbkr") pod "ffbc9776-7662-41d9-93ac-38c5b98709ab" (UID: "ffbc9776-7662-41d9-93ac-38c5b98709ab"). InnerVolumeSpecName "kube-api-access-nsbkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.462403 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ce55b0-1082-49c8-b648-92425775ed24-kube-api-access-6k555" (OuterVolumeSpecName: "kube-api-access-6k555") pod "42ce55b0-1082-49c8-b648-92425775ed24" (UID: "42ce55b0-1082-49c8-b648-92425775ed24"). InnerVolumeSpecName "kube-api-access-6k555". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.464319 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd3434b-b358-4463-a081-511dd7a3469d-kube-api-access-j5cmq" (OuterVolumeSpecName: "kube-api-access-j5cmq") pod "4fd3434b-b358-4463-a081-511dd7a3469d" (UID: "4fd3434b-b358-4463-a081-511dd7a3469d"). InnerVolumeSpecName "kube-api-access-j5cmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.477445 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4fd3434b-b358-4463-a081-511dd7a3469d" (UID: "4fd3434b-b358-4463-a081-511dd7a3469d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.479227 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4fd3434b-b358-4463-a081-511dd7a3469d" (UID: "4fd3434b-b358-4463-a081-511dd7a3469d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.482799 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fd3434b-b358-4463-a081-511dd7a3469d" (UID: "4fd3434b-b358-4463-a081-511dd7a3469d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.487403 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-scripts" (OuterVolumeSpecName: "scripts") pod "4fd3434b-b358-4463-a081-511dd7a3469d" (UID: "4fd3434b-b358-4463-a081-511dd7a3469d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556761 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k555\" (UniqueName: \"kubernetes.io/projected/42ce55b0-1082-49c8-b648-92425775ed24-kube-api-access-6k555\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556800 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5cmq\" (UniqueName: \"kubernetes.io/projected/4fd3434b-b358-4463-a081-511dd7a3469d-kube-api-access-j5cmq\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556813 4786 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556824 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556836 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsbkr\" (UniqueName: \"kubernetes.io/projected/ffbc9776-7662-41d9-93ac-38c5b98709ab-kube-api-access-nsbkr\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556847 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc9776-7662-41d9-93ac-38c5b98709ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556856 4786 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 
12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556870 4786 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4fd3434b-b358-4463-a081-511dd7a3469d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.556901 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fd3434b-b358-4463-a081-511dd7a3469d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721140 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rhxlw"] Mar 13 12:08:45 crc kubenswrapper[4786]: E0313 12:08:45.721498 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ce55b0-1082-49c8-b648-92425775ed24" containerName="mariadb-database-create" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721524 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ce55b0-1082-49c8-b648-92425775ed24" containerName="mariadb-database-create" Mar 13 12:08:45 crc kubenswrapper[4786]: E0313 12:08:45.721539 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87da50ad-41f1-4208-b36c-d874ac2250c7" containerName="mariadb-database-create" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721547 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="87da50ad-41f1-4208-b36c-d874ac2250c7" containerName="mariadb-database-create" Mar 13 12:08:45 crc kubenswrapper[4786]: E0313 12:08:45.721566 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbc9776-7662-41d9-93ac-38c5b98709ab" containerName="mariadb-account-create-update" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721574 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbc9776-7662-41d9-93ac-38c5b98709ab" containerName="mariadb-account-create-update" Mar 13 12:08:45 crc kubenswrapper[4786]: E0313 12:08:45.721590 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4fd3434b-b358-4463-a081-511dd7a3469d" containerName="swift-ring-rebalance" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721599 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd3434b-b358-4463-a081-511dd7a3469d" containerName="swift-ring-rebalance" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721771 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="87da50ad-41f1-4208-b36c-d874ac2250c7" containerName="mariadb-database-create" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721801 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ce55b0-1082-49c8-b648-92425775ed24" containerName="mariadb-database-create" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721815 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd3434b-b358-4463-a081-511dd7a3469d" containerName="swift-ring-rebalance" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.721826 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbc9776-7662-41d9-93ac-38c5b98709ab" containerName="mariadb-account-create-update" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.722389 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.731630 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rhxlw"] Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.829577 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-fef1-account-create-update-qx49r"] Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.830505 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.832157 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.852120 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lrq7b" event={"ID":"4fd3434b-b358-4463-a081-511dd7a3469d","Type":"ContainerDied","Data":"3aeb023e741a7ee941e31e761e3d59989cb76e5e491edbbee89a4c98dda417f3"} Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.852158 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aeb023e741a7ee941e31e761e3d59989cb76e5e491edbbee89a4c98dda417f3" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.852165 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lrq7b" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.854540 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fef1-account-create-update-qx49r"] Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.858663 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bsjm4" event={"ID":"87da50ad-41f1-4208-b36c-d874ac2250c7","Type":"ContainerDied","Data":"dabcb6fe3f7d3a3070f5b7d1fc4f34f064275b8528b6b0e026c3b81a3b025e3a"} Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.858710 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dabcb6fe3f7d3a3070f5b7d1fc4f34f064275b8528b6b0e026c3b81a3b025e3a" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.858801 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bsjm4" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.863377 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-tjh4s" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.863474 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tjh4s" event={"ID":"42ce55b0-1082-49c8-b648-92425775ed24","Type":"ContainerDied","Data":"934905257e954c1b207938a1681f406d83f6b1e1188d934bd85fc6eed09101b1"} Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.863517 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="934905257e954c1b207938a1681f406d83f6b1e1188d934bd85fc6eed09101b1" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.864951 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzkvd\" (UniqueName: \"kubernetes.io/projected/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-kube-api-access-bzkvd\") pod \"glance-db-create-rhxlw\" (UID: \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\") " pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.865005 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-operator-scripts\") pod \"glance-db-create-rhxlw\" (UID: \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\") " pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.869525 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5794-account-create-update-5vccb" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.869267 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5794-account-create-update-5vccb" event={"ID":"ffbc9776-7662-41d9-93ac-38c5b98709ab","Type":"ContainerDied","Data":"00fa36bcf118f74dc28c36600ba5d001444cbb1074e7377be6593738bfa85e0a"} Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.869735 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00fa36bcf118f74dc28c36600ba5d001444cbb1074e7377be6593738bfa85e0a" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.966485 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzkvd\" (UniqueName: \"kubernetes.io/projected/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-kube-api-access-bzkvd\") pod \"glance-db-create-rhxlw\" (UID: \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\") " pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.966538 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-operator-scripts\") pod \"glance-db-create-rhxlw\" (UID: \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\") " pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.966627 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa010b2d-a4cf-4646-b289-54e0a6e285dd-operator-scripts\") pod \"glance-fef1-account-create-update-qx49r\" (UID: \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\") " pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.966665 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xfglf\" (UniqueName: \"kubernetes.io/projected/fa010b2d-a4cf-4646-b289-54e0a6e285dd-kube-api-access-xfglf\") pod \"glance-fef1-account-create-update-qx49r\" (UID: \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\") " pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.967385 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-operator-scripts\") pod \"glance-db-create-rhxlw\" (UID: \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\") " pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:45 crc kubenswrapper[4786]: I0313 12:08:45.984828 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzkvd\" (UniqueName: \"kubernetes.io/projected/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-kube-api-access-bzkvd\") pod \"glance-db-create-rhxlw\" (UID: \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\") " pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.048939 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.068632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa010b2d-a4cf-4646-b289-54e0a6e285dd-operator-scripts\") pod \"glance-fef1-account-create-update-qx49r\" (UID: \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\") " pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.068693 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfglf\" (UniqueName: \"kubernetes.io/projected/fa010b2d-a4cf-4646-b289-54e0a6e285dd-kube-api-access-xfglf\") pod \"glance-fef1-account-create-update-qx49r\" (UID: \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\") " pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.069513 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa010b2d-a4cf-4646-b289-54e0a6e285dd-operator-scripts\") pod \"glance-fef1-account-create-update-qx49r\" (UID: \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\") " pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.084585 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfglf\" (UniqueName: \"kubernetes.io/projected/fa010b2d-a4cf-4646-b289-54e0a6e285dd-kube-api-access-xfglf\") pod \"glance-fef1-account-create-update-qx49r\" (UID: \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\") " pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.152443 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.481530 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fef1-account-create-update-qx49r"] Mar 13 12:08:46 crc kubenswrapper[4786]: W0313 12:08:46.494037 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa010b2d_a4cf_4646_b289_54e0a6e285dd.slice/crio-a7479e77189a18348ce0f52b5a2b16341c85a5f61a5dc4d02716c63dbb108d5c WatchSource:0}: Error finding container a7479e77189a18348ce0f52b5a2b16341c85a5f61a5dc4d02716c63dbb108d5c: Status 404 returned error can't find the container with id a7479e77189a18348ce0f52b5a2b16341c85a5f61a5dc4d02716c63dbb108d5c Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.610060 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rhxlw"] Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.850648 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.876777 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rhxlw" event={"ID":"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd","Type":"ContainerStarted","Data":"7cb0f6eb6e2738f3a7bb9e6c1037e4d0e724978388c10e0835185e8b2d1098cd"} Mar 13 12:08:46 crc kubenswrapper[4786]: I0313 12:08:46.877813 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fef1-account-create-update-qx49r" event={"ID":"fa010b2d-a4cf-4646-b289-54e0a6e285dd","Type":"ContainerStarted","Data":"a7479e77189a18348ce0f52b5a2b16341c85a5f61a5dc4d02716c63dbb108d5c"} Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.342145 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.508419 4786 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j2d2d"] Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.514728 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-j2d2d"] Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.586270 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gpn45"] Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.587244 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.601472 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gpn45"] Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.624028 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.699042 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d04659-fd3d-4668-9ad5-51824cb9760a-operator-scripts\") pod \"root-account-create-update-gpn45\" (UID: \"b7d04659-fd3d-4668-9ad5-51824cb9760a\") " pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.699216 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hsss\" (UniqueName: \"kubernetes.io/projected/b7d04659-fd3d-4668-9ad5-51824cb9760a-kube-api-access-4hsss\") pod \"root-account-create-update-gpn45\" (UID: \"b7d04659-fd3d-4668-9ad5-51824cb9760a\") " pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.801369 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hsss\" (UniqueName: 
\"kubernetes.io/projected/b7d04659-fd3d-4668-9ad5-51824cb9760a-kube-api-access-4hsss\") pod \"root-account-create-update-gpn45\" (UID: \"b7d04659-fd3d-4668-9ad5-51824cb9760a\") " pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.801513 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d04659-fd3d-4668-9ad5-51824cb9760a-operator-scripts\") pod \"root-account-create-update-gpn45\" (UID: \"b7d04659-fd3d-4668-9ad5-51824cb9760a\") " pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.802445 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d04659-fd3d-4668-9ad5-51824cb9760a-operator-scripts\") pod \"root-account-create-update-gpn45\" (UID: \"b7d04659-fd3d-4668-9ad5-51824cb9760a\") " pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.835628 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hsss\" (UniqueName: \"kubernetes.io/projected/b7d04659-fd3d-4668-9ad5-51824cb9760a-kube-api-access-4hsss\") pod \"root-account-create-update-gpn45\" (UID: \"b7d04659-fd3d-4668-9ad5-51824cb9760a\") " pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:47 crc kubenswrapper[4786]: I0313 12:08:47.972190 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.229505 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-666xn" podUID="b03b506e-7150-4904-b58b-8e442885af50" containerName="ovn-controller" probeResult="failure" output=< Mar 13 12:08:48 crc kubenswrapper[4786]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 12:08:48 crc kubenswrapper[4786]: > Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.247294 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tpch6" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.250623 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tpch6" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.453251 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gpn45"] Mar 13 12:08:48 crc kubenswrapper[4786]: W0313 12:08:48.463778 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7d04659_fd3d_4668_9ad5_51824cb9760a.slice/crio-e0809b50e91161b4fdec7c81fa5ac0e348a4ca9ea8f0db784fd7f0cdd90fe8a7 WatchSource:0}: Error finding container e0809b50e91161b4fdec7c81fa5ac0e348a4ca9ea8f0db784fd7f0cdd90fe8a7: Status 404 returned error can't find the container with id e0809b50e91161b4fdec7c81fa5ac0e348a4ca9ea8f0db784fd7f0cdd90fe8a7 Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.466469 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-666xn-config-sxrrn"] Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.467612 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.469592 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.492003 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666xn-config-sxrrn"] Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.518584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.518697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-additional-scripts\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.518837 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run-ovn\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.518947 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbdhv\" (UniqueName: \"kubernetes.io/projected/633baa3a-ae1a-4b77-9e96-217360f53a43-kube-api-access-fbdhv\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: 
\"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.519658 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-log-ovn\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.519760 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-scripts\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.621150 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.621447 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-additional-scripts\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.621480 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run-ovn\") pod \"ovn-controller-666xn-config-sxrrn\" 
(UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.621534 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.621636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run-ovn\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.622102 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-additional-scripts\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.622187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbdhv\" (UniqueName: \"kubernetes.io/projected/633baa3a-ae1a-4b77-9e96-217360f53a43-kube-api-access-fbdhv\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.622247 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-log-ovn\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: 
\"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.622560 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-scripts\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.622417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-log-ovn\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.624439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-scripts\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.643404 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbdhv\" (UniqueName: \"kubernetes.io/projected/633baa3a-ae1a-4b77-9e96-217360f53a43-kube-api-access-fbdhv\") pod \"ovn-controller-666xn-config-sxrrn\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.871214 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.893449 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gpn45" event={"ID":"b7d04659-fd3d-4668-9ad5-51824cb9760a","Type":"ContainerStarted","Data":"e0809b50e91161b4fdec7c81fa5ac0e348a4ca9ea8f0db784fd7f0cdd90fe8a7"} Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.895011 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rhxlw" event={"ID":"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd","Type":"ContainerStarted","Data":"39984635e5585195e1c8d56b7a28c676d85cf6623b602e27e486afa71454a871"} Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.896527 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fef1-account-create-update-qx49r" event={"ID":"fa010b2d-a4cf-4646-b289-54e0a6e285dd","Type":"ContainerStarted","Data":"85d10348665228b36bb7af791198dbb74fe9596aaf08a0b9b13a8d28493b4749"} Mar 13 12:08:48 crc kubenswrapper[4786]: I0313 12:08:48.920489 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-fef1-account-create-update-qx49r" podStartSLOduration=3.920472014 podStartE2EDuration="3.920472014s" podCreationTimestamp="2026-03-13 12:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:48.912421949 +0000 UTC m=+1316.192075396" watchObservedRunningTime="2026-03-13 12:08:48.920472014 +0000 UTC m=+1316.200125461" Mar 13 12:08:49 crc kubenswrapper[4786]: I0313 12:08:49.331737 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666xn-config-sxrrn"] Mar 13 12:08:49 crc kubenswrapper[4786]: W0313 12:08:49.338331 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod633baa3a_ae1a_4b77_9e96_217360f53a43.slice/crio-47ea18ecb6ba8610ddafc82458a73678b7731739889250d652fb90e0de677647 WatchSource:0}: Error finding container 47ea18ecb6ba8610ddafc82458a73678b7731739889250d652fb90e0de677647: Status 404 returned error can't find the container with id 47ea18ecb6ba8610ddafc82458a73678b7731739889250d652fb90e0de677647 Mar 13 12:08:49 crc kubenswrapper[4786]: I0313 12:08:49.454358 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0" path="/var/lib/kubelet/pods/564520fb-07f0-4e6d-bfa9-cbf7c2fd7af0/volumes" Mar 13 12:08:49 crc kubenswrapper[4786]: I0313 12:08:49.911301 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-sxrrn" event={"ID":"633baa3a-ae1a-4b77-9e96-217360f53a43","Type":"ContainerStarted","Data":"47ea18ecb6ba8610ddafc82458a73678b7731739889250d652fb90e0de677647"} Mar 13 12:08:49 crc kubenswrapper[4786]: I0313 12:08:49.935429 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-rhxlw" podStartSLOduration=4.935403677 podStartE2EDuration="4.935403677s" podCreationTimestamp="2026-03-13 12:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:49.932834719 +0000 UTC m=+1317.212488196" watchObservedRunningTime="2026-03-13 12:08:49.935403677 +0000 UTC m=+1317.215057134" Mar 13 12:08:50 crc kubenswrapper[4786]: I0313 12:08:50.923466 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-sxrrn" event={"ID":"633baa3a-ae1a-4b77-9e96-217360f53a43","Type":"ContainerStarted","Data":"366f7bdf5192896510fd5be7476e2f1fed3d2e1a33505671ebc219e89e72f9db"} Mar 13 12:08:50 crc kubenswrapper[4786]: I0313 12:08:50.925225 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-gpn45" event={"ID":"b7d04659-fd3d-4668-9ad5-51824cb9760a","Type":"ContainerStarted","Data":"af9a58bcbaa96db6c049cf3f0beded05596724e061992aa2269d55f6577c1329"} Mar 13 12:08:50 crc kubenswrapper[4786]: I0313 12:08:50.927974 4786 generic.go:334] "Generic (PLEG): container finished" podID="3ca1e192-7715-4bd6-b4b3-d6e6912b319c" containerID="06cbe6fcaf2610a101d910170cbceb0e2a4ec74893cf6c2d3fde4fd12608b429" exitCode=0 Mar 13 12:08:50 crc kubenswrapper[4786]: I0313 12:08:50.928016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d64c-account-create-update-zbfj6" event={"ID":"3ca1e192-7715-4bd6-b4b3-d6e6912b319c","Type":"ContainerDied","Data":"06cbe6fcaf2610a101d910170cbceb0e2a4ec74893cf6c2d3fde4fd12608b429"} Mar 13 12:08:50 crc kubenswrapper[4786]: I0313 12:08:50.930360 4786 generic.go:334] "Generic (PLEG): container finished" podID="befdb8e3-7615-4bd4-a6f8-dfa11bd924bd" containerID="39984635e5585195e1c8d56b7a28c676d85cf6623b602e27e486afa71454a871" exitCode=0 Mar 13 12:08:50 crc kubenswrapper[4786]: I0313 12:08:50.930602 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rhxlw" event={"ID":"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd","Type":"ContainerDied","Data":"39984635e5585195e1c8d56b7a28c676d85cf6623b602e27e486afa71454a871"} Mar 13 12:08:50 crc kubenswrapper[4786]: I0313 12:08:50.951301 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-666xn-config-sxrrn" podStartSLOduration=2.951280386 podStartE2EDuration="2.951280386s" podCreationTimestamp="2026-03-13 12:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:50.942318047 +0000 UTC m=+1318.221971504" watchObservedRunningTime="2026-03-13 12:08:50.951280386 +0000 UTC m=+1318.230933853" Mar 13 12:08:50 crc kubenswrapper[4786]: I0313 12:08:50.989947 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-gpn45" podStartSLOduration=3.989927015 podStartE2EDuration="3.989927015s" podCreationTimestamp="2026-03-13 12:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:50.987909681 +0000 UTC m=+1318.267563138" watchObservedRunningTime="2026-03-13 12:08:50.989927015 +0000 UTC m=+1318.269580472" Mar 13 12:08:51 crc kubenswrapper[4786]: I0313 12:08:51.938048 4786 generic.go:334] "Generic (PLEG): container finished" podID="b7d04659-fd3d-4668-9ad5-51824cb9760a" containerID="af9a58bcbaa96db6c049cf3f0beded05596724e061992aa2269d55f6577c1329" exitCode=0 Mar 13 12:08:51 crc kubenswrapper[4786]: I0313 12:08:51.938128 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gpn45" event={"ID":"b7d04659-fd3d-4668-9ad5-51824cb9760a","Type":"ContainerDied","Data":"af9a58bcbaa96db6c049cf3f0beded05596724e061992aa2269d55f6577c1329"} Mar 13 12:08:51 crc kubenswrapper[4786]: I0313 12:08:51.939633 4786 generic.go:334] "Generic (PLEG): container finished" podID="fa010b2d-a4cf-4646-b289-54e0a6e285dd" containerID="85d10348665228b36bb7af791198dbb74fe9596aaf08a0b9b13a8d28493b4749" exitCode=0 Mar 13 12:08:51 crc kubenswrapper[4786]: I0313 12:08:51.939675 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fef1-account-create-update-qx49r" event={"ID":"fa010b2d-a4cf-4646-b289-54e0a6e285dd","Type":"ContainerDied","Data":"85d10348665228b36bb7af791198dbb74fe9596aaf08a0b9b13a8d28493b4749"} Mar 13 12:08:51 crc kubenswrapper[4786]: I0313 12:08:51.941080 4786 generic.go:334] "Generic (PLEG): container finished" podID="633baa3a-ae1a-4b77-9e96-217360f53a43" containerID="366f7bdf5192896510fd5be7476e2f1fed3d2e1a33505671ebc219e89e72f9db" exitCode=0 Mar 13 12:08:51 crc kubenswrapper[4786]: I0313 12:08:51.941125 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-sxrrn" event={"ID":"633baa3a-ae1a-4b77-9e96-217360f53a43","Type":"ContainerDied","Data":"366f7bdf5192896510fd5be7476e2f1fed3d2e1a33505671ebc219e89e72f9db"} Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.284936 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.370284 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.421428 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzkvd\" (UniqueName: \"kubernetes.io/projected/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-kube-api-access-bzkvd\") pod \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\" (UID: \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\") " Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.421549 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-operator-scripts\") pod \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\" (UID: \"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd\") " Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.422135 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "befdb8e3-7615-4bd4-a6f8-dfa11bd924bd" (UID: "befdb8e3-7615-4bd4-a6f8-dfa11bd924bd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.428093 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-kube-api-access-bzkvd" (OuterVolumeSpecName: "kube-api-access-bzkvd") pod "befdb8e3-7615-4bd4-a6f8-dfa11bd924bd" (UID: "befdb8e3-7615-4bd4-a6f8-dfa11bd924bd"). InnerVolumeSpecName "kube-api-access-bzkvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.522575 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-operator-scripts\") pod \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\" (UID: \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\") " Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.522719 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2njk\" (UniqueName: \"kubernetes.io/projected/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-kube-api-access-f2njk\") pod \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\" (UID: \"3ca1e192-7715-4bd6-b4b3-d6e6912b319c\") " Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.523315 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ca1e192-7715-4bd6-b4b3-d6e6912b319c" (UID: "3ca1e192-7715-4bd6-b4b3-d6e6912b319c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.523457 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzkvd\" (UniqueName: \"kubernetes.io/projected/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-kube-api-access-bzkvd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.523481 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.523494 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.526060 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-kube-api-access-f2njk" (OuterVolumeSpecName: "kube-api-access-f2njk") pod "3ca1e192-7715-4bd6-b4b3-d6e6912b319c" (UID: "3ca1e192-7715-4bd6-b4b3-d6e6912b319c"). InnerVolumeSpecName "kube-api-access-f2njk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.624605 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2njk\" (UniqueName: \"kubernetes.io/projected/3ca1e192-7715-4bd6-b4b3-d6e6912b319c-kube-api-access-f2njk\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.952377 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d64c-account-create-update-zbfj6" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.952378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d64c-account-create-update-zbfj6" event={"ID":"3ca1e192-7715-4bd6-b4b3-d6e6912b319c","Type":"ContainerDied","Data":"11ed255f9db0be755c808cd93f2fe43448a37bafdfe8eb279d7fda8d84e34e27"} Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.952932 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ed255f9db0be755c808cd93f2fe43448a37bafdfe8eb279d7fda8d84e34e27" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.954692 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rhxlw" Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.956591 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rhxlw" event={"ID":"befdb8e3-7615-4bd4-a6f8-dfa11bd924bd","Type":"ContainerDied","Data":"7cb0f6eb6e2738f3a7bb9e6c1037e4d0e724978388c10e0835185e8b2d1098cd"} Mar 13 12:08:52 crc kubenswrapper[4786]: I0313 12:08:52.956629 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cb0f6eb6e2738f3a7bb9e6c1037e4d0e724978388c10e0835185e8b2d1098cd" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.254063 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-666xn" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.424136 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.428504 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.433766 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.538556 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfglf\" (UniqueName: \"kubernetes.io/projected/fa010b2d-a4cf-4646-b289-54e0a6e285dd-kube-api-access-xfglf\") pod \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\" (UID: \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.538803 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-additional-scripts\") pod \"633baa3a-ae1a-4b77-9e96-217360f53a43\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.538822 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-log-ovn\") pod \"633baa3a-ae1a-4b77-9e96-217360f53a43\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.538853 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-scripts\") pod \"633baa3a-ae1a-4b77-9e96-217360f53a43\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.538869 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbdhv\" (UniqueName: \"kubernetes.io/projected/633baa3a-ae1a-4b77-9e96-217360f53a43-kube-api-access-fbdhv\") pod 
\"633baa3a-ae1a-4b77-9e96-217360f53a43\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.538915 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run-ovn\") pod \"633baa3a-ae1a-4b77-9e96-217360f53a43\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.538983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "633baa3a-ae1a-4b77-9e96-217360f53a43" (UID: "633baa3a-ae1a-4b77-9e96-217360f53a43"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539051 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa010b2d-a4cf-4646-b289-54e0a6e285dd-operator-scripts\") pod \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\" (UID: \"fa010b2d-a4cf-4646-b289-54e0a6e285dd\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539078 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run\") pod \"633baa3a-ae1a-4b77-9e96-217360f53a43\" (UID: \"633baa3a-ae1a-4b77-9e96-217360f53a43\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539106 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hsss\" (UniqueName: \"kubernetes.io/projected/b7d04659-fd3d-4668-9ad5-51824cb9760a-kube-api-access-4hsss\") pod \"b7d04659-fd3d-4668-9ad5-51824cb9760a\" (UID: \"b7d04659-fd3d-4668-9ad5-51824cb9760a\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 
12:08:53.539121 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d04659-fd3d-4668-9ad5-51824cb9760a-operator-scripts\") pod \"b7d04659-fd3d-4668-9ad5-51824cb9760a\" (UID: \"b7d04659-fd3d-4668-9ad5-51824cb9760a\") " Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539318 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "633baa3a-ae1a-4b77-9e96-217360f53a43" (UID: "633baa3a-ae1a-4b77-9e96-217360f53a43"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539371 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "633baa3a-ae1a-4b77-9e96-217360f53a43" (UID: "633baa3a-ae1a-4b77-9e96-217360f53a43"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539683 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run" (OuterVolumeSpecName: "var-run") pod "633baa3a-ae1a-4b77-9e96-217360f53a43" (UID: "633baa3a-ae1a-4b77-9e96-217360f53a43"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539710 4786 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539732 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.539743 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.540000 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-scripts" (OuterVolumeSpecName: "scripts") pod "633baa3a-ae1a-4b77-9e96-217360f53a43" (UID: "633baa3a-ae1a-4b77-9e96-217360f53a43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.540022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d04659-fd3d-4668-9ad5-51824cb9760a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7d04659-fd3d-4668-9ad5-51824cb9760a" (UID: "b7d04659-fd3d-4668-9ad5-51824cb9760a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.540141 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa010b2d-a4cf-4646-b289-54e0a6e285dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa010b2d-a4cf-4646-b289-54e0a6e285dd" (UID: "fa010b2d-a4cf-4646-b289-54e0a6e285dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.555831 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/633baa3a-ae1a-4b77-9e96-217360f53a43-kube-api-access-fbdhv" (OuterVolumeSpecName: "kube-api-access-fbdhv") pod "633baa3a-ae1a-4b77-9e96-217360f53a43" (UID: "633baa3a-ae1a-4b77-9e96-217360f53a43"). InnerVolumeSpecName "kube-api-access-fbdhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.557323 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d04659-fd3d-4668-9ad5-51824cb9760a-kube-api-access-4hsss" (OuterVolumeSpecName: "kube-api-access-4hsss") pod "b7d04659-fd3d-4668-9ad5-51824cb9760a" (UID: "b7d04659-fd3d-4668-9ad5-51824cb9760a"). InnerVolumeSpecName "kube-api-access-4hsss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.557710 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa010b2d-a4cf-4646-b289-54e0a6e285dd-kube-api-access-xfglf" (OuterVolumeSpecName: "kube-api-access-xfglf") pod "fa010b2d-a4cf-4646-b289-54e0a6e285dd" (UID: "fa010b2d-a4cf-4646-b289-54e0a6e285dd"). InnerVolumeSpecName "kube-api-access-xfglf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.641200 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfglf\" (UniqueName: \"kubernetes.io/projected/fa010b2d-a4cf-4646-b289-54e0a6e285dd-kube-api-access-xfglf\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.641255 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/633baa3a-ae1a-4b77-9e96-217360f53a43-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.641275 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbdhv\" (UniqueName: \"kubernetes.io/projected/633baa3a-ae1a-4b77-9e96-217360f53a43-kube-api-access-fbdhv\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.641294 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa010b2d-a4cf-4646-b289-54e0a6e285dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.641311 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/633baa3a-ae1a-4b77-9e96-217360f53a43-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.641329 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hsss\" (UniqueName: \"kubernetes.io/projected/b7d04659-fd3d-4668-9ad5-51824cb9760a-kube-api-access-4hsss\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.641351 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d04659-fd3d-4668-9ad5-51824cb9760a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4786]: 
I0313 12:08:53.964302 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fef1-account-create-update-qx49r" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.967335 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fef1-account-create-update-qx49r" event={"ID":"fa010b2d-a4cf-4646-b289-54e0a6e285dd","Type":"ContainerDied","Data":"a7479e77189a18348ce0f52b5a2b16341c85a5f61a5dc4d02716c63dbb108d5c"} Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.967420 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7479e77189a18348ce0f52b5a2b16341c85a5f61a5dc4d02716c63dbb108d5c" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.969919 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-sxrrn" event={"ID":"633baa3a-ae1a-4b77-9e96-217360f53a43","Type":"ContainerDied","Data":"47ea18ecb6ba8610ddafc82458a73678b7731739889250d652fb90e0de677647"} Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.969972 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47ea18ecb6ba8610ddafc82458a73678b7731739889250d652fb90e0de677647" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.969976 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666xn-config-sxrrn" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.971728 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gpn45" event={"ID":"b7d04659-fd3d-4668-9ad5-51824cb9760a","Type":"ContainerDied","Data":"e0809b50e91161b4fdec7c81fa5ac0e348a4ca9ea8f0db784fd7f0cdd90fe8a7"} Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.971749 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0809b50e91161b4fdec7c81fa5ac0e348a4ca9ea8f0db784fd7f0cdd90fe8a7" Mar 13 12:08:53 crc kubenswrapper[4786]: I0313 12:08:53.971818 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gpn45" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.092236 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-666xn-config-sxrrn"] Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.109350 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-666xn-config-sxrrn"] Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.146769 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-666xn-config-nbnwk"] Mar 13 12:08:54 crc kubenswrapper[4786]: E0313 12:08:54.147185 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633baa3a-ae1a-4b77-9e96-217360f53a43" containerName="ovn-config" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147208 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="633baa3a-ae1a-4b77-9e96-217360f53a43" containerName="ovn-config" Mar 13 12:08:54 crc kubenswrapper[4786]: E0313 12:08:54.147232 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befdb8e3-7615-4bd4-a6f8-dfa11bd924bd" containerName="mariadb-database-create" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147240 4786 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="befdb8e3-7615-4bd4-a6f8-dfa11bd924bd" containerName="mariadb-database-create" Mar 13 12:08:54 crc kubenswrapper[4786]: E0313 12:08:54.147256 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca1e192-7715-4bd6-b4b3-d6e6912b319c" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147264 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca1e192-7715-4bd6-b4b3-d6e6912b319c" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: E0313 12:08:54.147273 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d04659-fd3d-4668-9ad5-51824cb9760a" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147281 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d04659-fd3d-4668-9ad5-51824cb9760a" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: E0313 12:08:54.147316 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa010b2d-a4cf-4646-b289-54e0a6e285dd" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147324 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa010b2d-a4cf-4646-b289-54e0a6e285dd" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147502 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa010b2d-a4cf-4646-b289-54e0a6e285dd" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147523 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="633baa3a-ae1a-4b77-9e96-217360f53a43" containerName="ovn-config" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147536 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="befdb8e3-7615-4bd4-a6f8-dfa11bd924bd" 
containerName="mariadb-database-create" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147550 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca1e192-7715-4bd6-b4b3-d6e6912b319c" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.147562 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d04659-fd3d-4668-9ad5-51824cb9760a" containerName="mariadb-account-create-update" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.148185 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.152051 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.157771 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666xn-config-nbnwk"] Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.250460 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-additional-scripts\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.250496 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-scripts\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.250535 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-log-ovn\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.250586 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.250617 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run-ovn\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.250644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4sff\" (UniqueName: \"kubernetes.io/projected/db469f94-ad60-4135-a65c-b54808df8307-kube-api-access-c4sff\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352423 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-additional-scripts\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352474 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-scripts\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352530 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-log-ovn\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352581 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run-ovn\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352660 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4sff\" (UniqueName: \"kubernetes.io/projected/db469f94-ad60-4135-a65c-b54808df8307-kube-api-access-c4sff\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352921 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352921 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run-ovn\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.352995 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-log-ovn\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.354031 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-additional-scripts\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.354677 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-scripts\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.373483 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4sff\" (UniqueName: 
\"kubernetes.io/projected/db469f94-ad60-4135-a65c-b54808df8307-kube-api-access-c4sff\") pod \"ovn-controller-666xn-config-nbnwk\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.467463 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.904963 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666xn-config-nbnwk"] Mar 13 12:08:54 crc kubenswrapper[4786]: I0313 12:08:54.992428 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-nbnwk" event={"ID":"db469f94-ad60-4135-a65c-b54808df8307","Type":"ContainerStarted","Data":"eb2e20f78143ed589c0ee2a06e61dd33a6c121c5e73bec62b8e421c52b15fc38"} Mar 13 12:08:55 crc kubenswrapper[4786]: I0313 12:08:55.458488 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="633baa3a-ae1a-4b77-9e96-217360f53a43" path="/var/lib/kubelet/pods/633baa3a-ae1a-4b77-9e96-217360f53a43/volumes" Mar 13 12:08:55 crc kubenswrapper[4786]: I0313 12:08:55.881951 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:55 crc kubenswrapper[4786]: I0313 12:08:55.889350 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"swift-storage-0\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " pod="openstack/swift-storage-0" Mar 13 12:08:55 crc kubenswrapper[4786]: I0313 12:08:55.914468 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-db-sync-8shl7"] Mar 13 12:08:55 crc kubenswrapper[4786]: I0313 12:08:55.916393 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:55 crc kubenswrapper[4786]: I0313 12:08:55.921148 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 12:08:55 crc kubenswrapper[4786]: I0313 12:08:55.922187 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8shl7"] Mar 13 12:08:55 crc kubenswrapper[4786]: I0313 12:08:55.923270 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8lmx7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.001016 4786 generic.go:334] "Generic (PLEG): container finished" podID="db469f94-ad60-4135-a65c-b54808df8307" containerID="9ab0287f8e3b9501e6a4aadca9cf4e40efa14d5aa33a94b16e22d38625d8e92c" exitCode=0 Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.001063 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-nbnwk" event={"ID":"db469f94-ad60-4135-a65c-b54808df8307","Type":"ContainerDied","Data":"9ab0287f8e3b9501e6a4aadca9cf4e40efa14d5aa33a94b16e22d38625d8e92c"} Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.085021 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-db-sync-config-data\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.085069 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbkd\" (UniqueName: \"kubernetes.io/projected/a594fa40-6352-480d-8927-c04bf51c9c51-kube-api-access-dvbkd\") pod \"glance-db-sync-8shl7\" 
(UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.085086 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-combined-ca-bundle\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.085213 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-config-data\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.106989 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.187259 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-config-data\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.187419 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-db-sync-config-data\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.187449 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbkd\" (UniqueName: 
\"kubernetes.io/projected/a594fa40-6352-480d-8927-c04bf51c9c51-kube-api-access-dvbkd\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.187472 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-combined-ca-bundle\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.191340 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-db-sync-config-data\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.191975 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-config-data\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.194421 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-combined-ca-bundle\") pod \"glance-db-sync-8shl7\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.205364 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbkd\" (UniqueName: \"kubernetes.io/projected/a594fa40-6352-480d-8927-c04bf51c9c51-kube-api-access-dvbkd\") pod \"glance-db-sync-8shl7\" (UID: 
\"a594fa40-6352-480d-8927-c04bf51c9c51\") " pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.273719 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8shl7" Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.678531 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:08:56 crc kubenswrapper[4786]: W0313 12:08:56.821936 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda594fa40_6352_480d_8927_c04bf51c9c51.slice/crio-10933b84917754f26775161eea8e84c490f5925d3d157dfe473aa1b08e695c05 WatchSource:0}: Error finding container 10933b84917754f26775161eea8e84c490f5925d3d157dfe473aa1b08e695c05: Status 404 returned error can't find the container with id 10933b84917754f26775161eea8e84c490f5925d3d157dfe473aa1b08e695c05 Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.827023 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8shl7"] Mar 13 12:08:56 crc kubenswrapper[4786]: I0313 12:08:56.851674 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.010707 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"3c393d37dd069e468a39269dad91c2bbb9381829f40438afa1079f94066daae8"} Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.011521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8shl7" 
event={"ID":"a594fa40-6352-480d-8927-c04bf51c9c51","Type":"ContainerStarted","Data":"10933b84917754f26775161eea8e84c490f5925d3d157dfe473aa1b08e695c05"} Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.277137 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.343775 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.410331 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run\") pod \"db469f94-ad60-4135-a65c-b54808df8307\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.410440 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-log-ovn\") pod \"db469f94-ad60-4135-a65c-b54808df8307\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.410511 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-scripts\") pod \"db469f94-ad60-4135-a65c-b54808df8307\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.410558 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4sff\" (UniqueName: \"kubernetes.io/projected/db469f94-ad60-4135-a65c-b54808df8307-kube-api-access-c4sff\") pod 
\"db469f94-ad60-4135-a65c-b54808df8307\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.410585 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run-ovn\") pod \"db469f94-ad60-4135-a65c-b54808df8307\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.410635 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-additional-scripts\") pod \"db469f94-ad60-4135-a65c-b54808df8307\" (UID: \"db469f94-ad60-4135-a65c-b54808df8307\") " Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.410908 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "db469f94-ad60-4135-a65c-b54808df8307" (UID: "db469f94-ad60-4135-a65c-b54808df8307"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.410966 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run" (OuterVolumeSpecName: "var-run") pod "db469f94-ad60-4135-a65c-b54808df8307" (UID: "db469f94-ad60-4135-a65c-b54808df8307"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.411019 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "db469f94-ad60-4135-a65c-b54808df8307" (UID: "db469f94-ad60-4135-a65c-b54808df8307"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.411559 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "db469f94-ad60-4135-a65c-b54808df8307" (UID: "db469f94-ad60-4135-a65c-b54808df8307"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.411727 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-scripts" (OuterVolumeSpecName: "scripts") pod "db469f94-ad60-4135-a65c-b54808df8307" (UID: "db469f94-ad60-4135-a65c-b54808df8307"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.415831 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db469f94-ad60-4135-a65c-b54808df8307-kube-api-access-c4sff" (OuterVolumeSpecName: "kube-api-access-c4sff") pod "db469f94-ad60-4135-a65c-b54808df8307" (UID: "db469f94-ad60-4135-a65c-b54808df8307"). InnerVolumeSpecName "kube-api-access-c4sff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.512223 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.512417 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.512475 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4sff\" (UniqueName: \"kubernetes.io/projected/db469f94-ad60-4135-a65c-b54808df8307-kube-api-access-c4sff\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.512534 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.512585 4786 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/db469f94-ad60-4135-a65c-b54808df8307-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:57 crc kubenswrapper[4786]: I0313 12:08:57.512635 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/db469f94-ad60-4135-a65c-b54808df8307-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.028835 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-nbnwk" event={"ID":"db469f94-ad60-4135-a65c-b54808df8307","Type":"ContainerDied","Data":"eb2e20f78143ed589c0ee2a06e61dd33a6c121c5e73bec62b8e421c52b15fc38"} Mar 13 12:08:58 crc 
kubenswrapper[4786]: I0313 12:08:58.028933 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb2e20f78143ed589c0ee2a06e61dd33a6c121c5e73bec62b8e421c52b15fc38" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.029026 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn-config-nbnwk" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.372944 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-666xn-config-nbnwk"] Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.382778 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-666xn-config-nbnwk"] Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.513720 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-666xn-config-5fxw6"] Mar 13 12:08:58 crc kubenswrapper[4786]: E0313 12:08:58.520398 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db469f94-ad60-4135-a65c-b54808df8307" containerName="ovn-config" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.520427 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="db469f94-ad60-4135-a65c-b54808df8307" containerName="ovn-config" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.520933 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="db469f94-ad60-4135-a65c-b54808df8307" containerName="ovn-config" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.521694 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.528960 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.538114 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666xn-config-5fxw6"] Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.643955 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-additional-scripts\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.644254 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.644298 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run-ovn\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.644345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-log-ovn\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: 
\"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.644372 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c2qh\" (UniqueName: \"kubernetes.io/projected/92ef413c-ffc3-4e32-aedc-2d7248571527-kube-api-access-7c2qh\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.644410 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-scripts\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.745909 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-log-ovn\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.745968 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c2qh\" (UniqueName: \"kubernetes.io/projected/92ef413c-ffc3-4e32-aedc-2d7248571527-kube-api-access-7c2qh\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.746019 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-scripts\") pod 
\"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.746053 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-additional-scripts\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.746089 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.746128 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run-ovn\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.746287 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run-ovn\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.746309 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-log-ovn\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: 
\"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.746504 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.746953 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-additional-scripts\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.750825 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-scripts\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.768112 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c2qh\" (UniqueName: \"kubernetes.io/projected/92ef413c-ffc3-4e32-aedc-2d7248571527-kube-api-access-7c2qh\") pod \"ovn-controller-666xn-config-5fxw6\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:58 crc kubenswrapper[4786]: I0313 12:08:58.860169 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:08:59 crc kubenswrapper[4786]: I0313 12:08:59.043812 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3"} Mar 13 12:08:59 crc kubenswrapper[4786]: I0313 12:08:59.309249 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666xn-config-5fxw6"] Mar 13 12:08:59 crc kubenswrapper[4786]: I0313 12:08:59.456612 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db469f94-ad60-4135-a65c-b54808df8307" path="/var/lib/kubelet/pods/db469f94-ad60-4135-a65c-b54808df8307/volumes" Mar 13 12:09:00 crc kubenswrapper[4786]: I0313 12:09:00.054825 4786 generic.go:334] "Generic (PLEG): container finished" podID="92ef413c-ffc3-4e32-aedc-2d7248571527" containerID="9234d8507c7d5b1040d1c4371fed627976b7224e46dea272c706f536bfab99b6" exitCode=0 Mar 13 12:09:00 crc kubenswrapper[4786]: I0313 12:09:00.054869 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-5fxw6" event={"ID":"92ef413c-ffc3-4e32-aedc-2d7248571527","Type":"ContainerDied","Data":"9234d8507c7d5b1040d1c4371fed627976b7224e46dea272c706f536bfab99b6"} Mar 13 12:09:00 crc kubenswrapper[4786]: I0313 12:09:00.055202 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-5fxw6" event={"ID":"92ef413c-ffc3-4e32-aedc-2d7248571527","Type":"ContainerStarted","Data":"0cdc2b9fbcebe288ae2c6847a5579a01b7f56e1ed8680ce1d88a0744cc5b77e7"} Mar 13 12:09:00 crc kubenswrapper[4786]: I0313 12:09:00.062770 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839"} Mar 13 
12:09:00 crc kubenswrapper[4786]: I0313 12:09:00.062813 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d"} Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.075274 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f"} Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.434499 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.603449 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-scripts\") pod \"92ef413c-ffc3-4e32-aedc-2d7248571527\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.603796 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run-ovn\") pod \"92ef413c-ffc3-4e32-aedc-2d7248571527\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.603840 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-additional-scripts\") pod \"92ef413c-ffc3-4e32-aedc-2d7248571527\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.603894 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run\") pod \"92ef413c-ffc3-4e32-aedc-2d7248571527\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.603954 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-log-ovn\") pod \"92ef413c-ffc3-4e32-aedc-2d7248571527\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.603951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "92ef413c-ffc3-4e32-aedc-2d7248571527" (UID: "92ef413c-ffc3-4e32-aedc-2d7248571527"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.603995 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run" (OuterVolumeSpecName: "var-run") pod "92ef413c-ffc3-4e32-aedc-2d7248571527" (UID: "92ef413c-ffc3-4e32-aedc-2d7248571527"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.604000 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c2qh\" (UniqueName: \"kubernetes.io/projected/92ef413c-ffc3-4e32-aedc-2d7248571527-kube-api-access-7c2qh\") pod \"92ef413c-ffc3-4e32-aedc-2d7248571527\" (UID: \"92ef413c-ffc3-4e32-aedc-2d7248571527\") " Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.604111 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "92ef413c-ffc3-4e32-aedc-2d7248571527" (UID: "92ef413c-ffc3-4e32-aedc-2d7248571527"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.604524 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "92ef413c-ffc3-4e32-aedc-2d7248571527" (UID: "92ef413c-ffc3-4e32-aedc-2d7248571527"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.604734 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-scripts" (OuterVolumeSpecName: "scripts") pod "92ef413c-ffc3-4e32-aedc-2d7248571527" (UID: "92ef413c-ffc3-4e32-aedc-2d7248571527"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.604748 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.604762 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.604773 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92ef413c-ffc3-4e32-aedc-2d7248571527-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.604782 4786 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.608215 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ef413c-ffc3-4e32-aedc-2d7248571527-kube-api-access-7c2qh" (OuterVolumeSpecName: "kube-api-access-7c2qh") pod "92ef413c-ffc3-4e32-aedc-2d7248571527" (UID: "92ef413c-ffc3-4e32-aedc-2d7248571527"). InnerVolumeSpecName "kube-api-access-7c2qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.706159 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c2qh\" (UniqueName: \"kubernetes.io/projected/92ef413c-ffc3-4e32-aedc-2d7248571527-kube-api-access-7c2qh\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:01 crc kubenswrapper[4786]: I0313 12:09:01.706186 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef413c-ffc3-4e32-aedc-2d7248571527-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.085371 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn-config-5fxw6" event={"ID":"92ef413c-ffc3-4e32-aedc-2d7248571527","Type":"ContainerDied","Data":"0cdc2b9fbcebe288ae2c6847a5579a01b7f56e1ed8680ce1d88a0744cc5b77e7"} Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.085413 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cdc2b9fbcebe288ae2c6847a5579a01b7f56e1ed8680ce1d88a0744cc5b77e7" Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.085415 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666xn-config-5fxw6" Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.089101 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3"} Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.089158 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b"} Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.089174 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885"} Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.089186 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c"} Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.520287 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-666xn-config-5fxw6"] Mar 13 12:09:02 crc kubenswrapper[4786]: I0313 12:09:02.526958 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-666xn-config-5fxw6"] Mar 13 12:09:03 crc kubenswrapper[4786]: I0313 12:09:03.456652 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ef413c-ffc3-4e32-aedc-2d7248571527" path="/var/lib/kubelet/pods/92ef413c-ffc3-4e32-aedc-2d7248571527/volumes" Mar 13 12:09:06 crc kubenswrapper[4786]: I0313 12:09:06.851114 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/rabbitmq-server-0" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.130807 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-467fr"] Mar 13 12:09:07 crc kubenswrapper[4786]: E0313 12:09:07.131565 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ef413c-ffc3-4e32-aedc-2d7248571527" containerName="ovn-config" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.131583 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ef413c-ffc3-4e32-aedc-2d7248571527" containerName="ovn-config" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.131758 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ef413c-ffc3-4e32-aedc-2d7248571527" containerName="ovn-config" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.132427 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-467fr" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.154542 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-467fr"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.201971 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6lg6\" (UniqueName: \"kubernetes.io/projected/a5c2078a-f957-4d60-9a47-f7b0c7248b75-kube-api-access-r6lg6\") pod \"cinder-db-create-467fr\" (UID: \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\") " pod="openstack/cinder-db-create-467fr" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.202069 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c2078a-f957-4d60-9a47-f7b0c7248b75-operator-scripts\") pod \"cinder-db-create-467fr\" (UID: \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\") " pod="openstack/cinder-db-create-467fr" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.233038 
4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5197-account-create-update-6g4d9"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.234870 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.239539 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.255197 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5197-account-create-update-6g4d9"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.303418 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c2078a-f957-4d60-9a47-f7b0c7248b75-operator-scripts\") pod \"cinder-db-create-467fr\" (UID: \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\") " pod="openstack/cinder-db-create-467fr" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.303501 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192abbc-1942-4e41-8e85-4416d725ac32-operator-scripts\") pod \"cinder-5197-account-create-update-6g4d9\" (UID: \"d192abbc-1942-4e41-8e85-4416d725ac32\") " pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.303555 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv57g\" (UniqueName: \"kubernetes.io/projected/d192abbc-1942-4e41-8e85-4416d725ac32-kube-api-access-nv57g\") pod \"cinder-5197-account-create-update-6g4d9\" (UID: \"d192abbc-1942-4e41-8e85-4416d725ac32\") " pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.303631 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r6lg6\" (UniqueName: \"kubernetes.io/projected/a5c2078a-f957-4d60-9a47-f7b0c7248b75-kube-api-access-r6lg6\") pod \"cinder-db-create-467fr\" (UID: \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\") " pod="openstack/cinder-db-create-467fr" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.308029 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c2078a-f957-4d60-9a47-f7b0c7248b75-operator-scripts\") pod \"cinder-db-create-467fr\" (UID: \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\") " pod="openstack/cinder-db-create-467fr" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.329975 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6lg6\" (UniqueName: \"kubernetes.io/projected/a5c2078a-f957-4d60-9a47-f7b0c7248b75-kube-api-access-r6lg6\") pod \"cinder-db-create-467fr\" (UID: \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\") " pod="openstack/cinder-db-create-467fr" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.343015 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.405348 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192abbc-1942-4e41-8e85-4416d725ac32-operator-scripts\") pod \"cinder-5197-account-create-update-6g4d9\" (UID: \"d192abbc-1942-4e41-8e85-4416d725ac32\") " pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.405418 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv57g\" (UniqueName: \"kubernetes.io/projected/d192abbc-1942-4e41-8e85-4416d725ac32-kube-api-access-nv57g\") pod \"cinder-5197-account-create-update-6g4d9\" (UID: 
\"d192abbc-1942-4e41-8e85-4416d725ac32\") " pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.408218 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192abbc-1942-4e41-8e85-4416d725ac32-operator-scripts\") pod \"cinder-5197-account-create-update-6g4d9\" (UID: \"d192abbc-1942-4e41-8e85-4416d725ac32\") " pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.413463 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-lvhh7"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.414465 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.433753 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3078-account-create-update-n6pkt"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.435112 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.437853 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.448636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv57g\" (UniqueName: \"kubernetes.io/projected/d192abbc-1942-4e41-8e85-4416d725ac32-kube-api-access-nv57g\") pod \"cinder-5197-account-create-update-6g4d9\" (UID: \"d192abbc-1942-4e41-8e85-4416d725ac32\") " pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.459227 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-467fr" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.473148 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-lvhh7"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.473183 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3078-account-create-update-n6pkt"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.506471 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r2g7\" (UniqueName: \"kubernetes.io/projected/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-kube-api-access-6r2g7\") pod \"barbican-3078-account-create-update-n6pkt\" (UID: \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\") " pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.506608 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4cq5\" (UniqueName: \"kubernetes.io/projected/8909d231-1928-4f63-b383-856cb26fa4a2-kube-api-access-c4cq5\") pod \"barbican-db-create-lvhh7\" (UID: \"8909d231-1928-4f63-b383-856cb26fa4a2\") " pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.506660 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8909d231-1928-4f63-b383-856cb26fa4a2-operator-scripts\") pod \"barbican-db-create-lvhh7\" (UID: \"8909d231-1928-4f63-b383-856cb26fa4a2\") " pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.506703 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-operator-scripts\") pod \"barbican-3078-account-create-update-n6pkt\" 
(UID: \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\") " pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.537262 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rj4n5"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.538603 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.549502 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.552312 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rj4n5"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.560601 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-da6e-account-create-update-56fnk"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.561591 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.564321 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.567242 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-da6e-account-create-update-56fnk"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.607716 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4cq5\" (UniqueName: \"kubernetes.io/projected/8909d231-1928-4f63-b383-856cb26fa4a2-kube-api-access-c4cq5\") pod \"barbican-db-create-lvhh7\" (UID: \"8909d231-1928-4f63-b383-856cb26fa4a2\") " pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.607765 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8909d231-1928-4f63-b383-856cb26fa4a2-operator-scripts\") pod \"barbican-db-create-lvhh7\" (UID: \"8909d231-1928-4f63-b383-856cb26fa4a2\") " pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.607801 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af72684-e70e-4ff4-a72c-d4e830667645-operator-scripts\") pod \"neutron-db-create-rj4n5\" (UID: \"3af72684-e70e-4ff4-a72c-d4e830667645\") " pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.607834 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-operator-scripts\") pod \"barbican-3078-account-create-update-n6pkt\" (UID: \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\") " 
pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.607872 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m45mp\" (UniqueName: \"kubernetes.io/projected/3af72684-e70e-4ff4-a72c-d4e830667645-kube-api-access-m45mp\") pod \"neutron-db-create-rj4n5\" (UID: \"3af72684-e70e-4ff4-a72c-d4e830667645\") " pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.607959 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r2g7\" (UniqueName: \"kubernetes.io/projected/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-kube-api-access-6r2g7\") pod \"barbican-3078-account-create-update-n6pkt\" (UID: \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\") " pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.608771 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-operator-scripts\") pod \"barbican-3078-account-create-update-n6pkt\" (UID: \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\") " pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.608792 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8909d231-1928-4f63-b383-856cb26fa4a2-operator-scripts\") pod \"barbican-db-create-lvhh7\" (UID: \"8909d231-1928-4f63-b383-856cb26fa4a2\") " pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.634191 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4l225"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.639872 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.642915 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.643156 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.643195 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4l225"] Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.643308 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-66qfn" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.643396 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.651035 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r2g7\" (UniqueName: \"kubernetes.io/projected/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-kube-api-access-6r2g7\") pod \"barbican-3078-account-create-update-n6pkt\" (UID: \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\") " pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.659271 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4cq5\" (UniqueName: \"kubernetes.io/projected/8909d231-1928-4f63-b383-856cb26fa4a2-kube-api-access-c4cq5\") pod \"barbican-db-create-lvhh7\" (UID: \"8909d231-1928-4f63-b383-856cb26fa4a2\") " pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.709545 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-operator-scripts\") pod 
\"neutron-da6e-account-create-update-56fnk\" (UID: \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\") " pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.709654 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af72684-e70e-4ff4-a72c-d4e830667645-operator-scripts\") pod \"neutron-db-create-rj4n5\" (UID: \"3af72684-e70e-4ff4-a72c-d4e830667645\") " pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.709704 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m45mp\" (UniqueName: \"kubernetes.io/projected/3af72684-e70e-4ff4-a72c-d4e830667645-kube-api-access-m45mp\") pod \"neutron-db-create-rj4n5\" (UID: \"3af72684-e70e-4ff4-a72c-d4e830667645\") " pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.709728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzvg4\" (UniqueName: \"kubernetes.io/projected/ce773763-3741-4253-87c8-9726920b41dc-kube-api-access-zzvg4\") pod \"keystone-db-sync-4l225\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.709759 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twnsn\" (UniqueName: \"kubernetes.io/projected/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-kube-api-access-twnsn\") pod \"neutron-da6e-account-create-update-56fnk\" (UID: \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\") " pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.709826 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-combined-ca-bundle\") pod \"keystone-db-sync-4l225\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.709896 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-config-data\") pod \"keystone-db-sync-4l225\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.710796 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af72684-e70e-4ff4-a72c-d4e830667645-operator-scripts\") pod \"neutron-db-create-rj4n5\" (UID: \"3af72684-e70e-4ff4-a72c-d4e830667645\") " pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.726810 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m45mp\" (UniqueName: \"kubernetes.io/projected/3af72684-e70e-4ff4-a72c-d4e830667645-kube-api-access-m45mp\") pod \"neutron-db-create-rj4n5\" (UID: \"3af72684-e70e-4ff4-a72c-d4e830667645\") " pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.733054 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.799935 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.813893 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-config-data\") pod \"keystone-db-sync-4l225\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.813948 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-operator-scripts\") pod \"neutron-da6e-account-create-update-56fnk\" (UID: \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\") " pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.814027 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzvg4\" (UniqueName: \"kubernetes.io/projected/ce773763-3741-4253-87c8-9726920b41dc-kube-api-access-zzvg4\") pod \"keystone-db-sync-4l225\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.814073 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twnsn\" (UniqueName: \"kubernetes.io/projected/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-kube-api-access-twnsn\") pod \"neutron-da6e-account-create-update-56fnk\" (UID: \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\") " pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.814119 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-combined-ca-bundle\") pod \"keystone-db-sync-4l225\" (UID: 
\"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.816179 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-operator-scripts\") pod \"neutron-da6e-account-create-update-56fnk\" (UID: \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\") " pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.821655 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-combined-ca-bundle\") pod \"keystone-db-sync-4l225\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.833286 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-config-data\") pod \"keystone-db-sync-4l225\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.835329 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzvg4\" (UniqueName: \"kubernetes.io/projected/ce773763-3741-4253-87c8-9726920b41dc-kube-api-access-zzvg4\") pod \"keystone-db-sync-4l225\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.842056 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twnsn\" (UniqueName: \"kubernetes.io/projected/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-kube-api-access-twnsn\") pod \"neutron-da6e-account-create-update-56fnk\" (UID: \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\") " 
pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.857694 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:07 crc kubenswrapper[4786]: I0313 12:09:07.878191 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:08 crc kubenswrapper[4786]: I0313 12:09:08.060948 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:11 crc kubenswrapper[4786]: I0313 12:09:11.387596 4786 scope.go:117] "RemoveContainer" containerID="3412e6946ef7f2fe51beeb053507012c5963bddc334090e08cf9dbf529ca8bb5" Mar 13 12:09:11 crc kubenswrapper[4786]: I0313 12:09:11.931977 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-da6e-account-create-update-56fnk"] Mar 13 12:09:11 crc kubenswrapper[4786]: W0313 12:09:11.935966 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5dd74a7_14cd_4d77_95a3_0d8c98edb870.slice/crio-571e6f5b91b4e3a588a661955d4fe7068ed8f41e856a842943088c2869fe3091 WatchSource:0}: Error finding container 571e6f5b91b4e3a588a661955d4fe7068ed8f41e856a842943088c2869fe3091: Status 404 returned error can't find the container with id 571e6f5b91b4e3a588a661955d4fe7068ed8f41e856a842943088c2869fe3091 Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.058422 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4l225"] Mar 13 12:09:12 crc kubenswrapper[4786]: W0313 12:09:12.063520 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce773763_3741_4253_87c8_9726920b41dc.slice/crio-d739d97b41bf6a26ae9226f02a0cfcae724025883c938baaaff14eb2aeef1309 WatchSource:0}: Error 
finding container d739d97b41bf6a26ae9226f02a0cfcae724025883c938baaaff14eb2aeef1309: Status 404 returned error can't find the container with id d739d97b41bf6a26ae9226f02a0cfcae724025883c938baaaff14eb2aeef1309 Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.065616 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3078-account-create-update-n6pkt"] Mar 13 12:09:12 crc kubenswrapper[4786]: W0313 12:09:12.076063 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e6328ef_79d0_4db2_a172_1e2bbd1f8923.slice/crio-dce7fb51c758f4b2ef0ea04eadc75490f026350fdce1e0b14612f41bcbd765cf WatchSource:0}: Error finding container dce7fb51c758f4b2ef0ea04eadc75490f026350fdce1e0b14612f41bcbd765cf: Status 404 returned error can't find the container with id dce7fb51c758f4b2ef0ea04eadc75490f026350fdce1e0b14612f41bcbd765cf Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.157573 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-lvhh7"] Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.193970 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5197-account-create-update-6g4d9"] Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.207928 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rj4n5"] Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.215963 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-467fr"] Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.217511 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227"} Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.217560 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e"} Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.224566 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4l225" event={"ID":"ce773763-3741-4253-87c8-9726920b41dc","Type":"ContainerStarted","Data":"d739d97b41bf6a26ae9226f02a0cfcae724025883c938baaaff14eb2aeef1309"} Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.230966 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3078-account-create-update-n6pkt" event={"ID":"6e6328ef-79d0-4db2-a172-1e2bbd1f8923","Type":"ContainerStarted","Data":"dce7fb51c758f4b2ef0ea04eadc75490f026350fdce1e0b14612f41bcbd765cf"} Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.239235 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da6e-account-create-update-56fnk" event={"ID":"d5dd74a7-14cd-4d77-95a3-0d8c98edb870","Type":"ContainerStarted","Data":"0c8aae5fcc18920f5dd30bba03e2ecfbcd982311b38cc18ee5401f8207fe92e9"} Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.239278 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da6e-account-create-update-56fnk" event={"ID":"d5dd74a7-14cd-4d77-95a3-0d8c98edb870","Type":"ContainerStarted","Data":"571e6f5b91b4e3a588a661955d4fe7068ed8f41e856a842943088c2869fe3091"} Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.241526 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lvhh7" event={"ID":"8909d231-1928-4f63-b383-856cb26fa4a2","Type":"ContainerStarted","Data":"03cc58646b47e6deceaf00aa893267c9406804fc51845e6e05126a91e57da249"} Mar 13 12:09:12 crc kubenswrapper[4786]: I0313 12:09:12.283853 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-da6e-account-create-update-56fnk" 
podStartSLOduration=5.283828866 podStartE2EDuration="5.283828866s" podCreationTimestamp="2026-03-13 12:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:12.253995072 +0000 UTC m=+1339.533648549" watchObservedRunningTime="2026-03-13 12:09:12.283828866 +0000 UTC m=+1339.563482313" Mar 13 12:09:13 crc kubenswrapper[4786]: E0313 12:09:13.133297 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8909d231_1928_4f63_b383_856cb26fa4a2.slice/crio-conmon-c3d7a7117c0b8a182edf5a164208dda5d4e7f7b6662de2f2a8cf06ca833520b5.scope\": RecentStats: unable to find data in memory cache]" Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.252425 4786 generic.go:334] "Generic (PLEG): container finished" podID="3af72684-e70e-4ff4-a72c-d4e830667645" containerID="0af6194d99a56d00d89b6a59c543eeee81163c06a3dddb5e0f1108fc2b69ec6e" exitCode=0 Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.252507 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rj4n5" event={"ID":"3af72684-e70e-4ff4-a72c-d4e830667645","Type":"ContainerDied","Data":"0af6194d99a56d00d89b6a59c543eeee81163c06a3dddb5e0f1108fc2b69ec6e"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.252905 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rj4n5" event={"ID":"3af72684-e70e-4ff4-a72c-d4e830667645","Type":"ContainerStarted","Data":"9685946a323da845c02798b237e5c81e33ef1323e1a46cc65dbd8ba6b85d62b3"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.254924 4786 generic.go:334] "Generic (PLEG): container finished" podID="6e6328ef-79d0-4db2-a172-1e2bbd1f8923" containerID="1e10720c51b5e71372255bc026ee7d2a7ce0fc88ce17d5796e1f682a8d8fef6c" exitCode=0 Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 
12:09:13.255163 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3078-account-create-update-n6pkt" event={"ID":"6e6328ef-79d0-4db2-a172-1e2bbd1f8923","Type":"ContainerDied","Data":"1e10720c51b5e71372255bc026ee7d2a7ce0fc88ce17d5796e1f682a8d8fef6c"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.261486 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5197-account-create-update-6g4d9" event={"ID":"d192abbc-1942-4e41-8e85-4416d725ac32","Type":"ContainerStarted","Data":"63d1c6a4ed628e0270bfaf6f6a59a54966e217bd3f5f6948011b4c84cc5c5d66"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.261539 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5197-account-create-update-6g4d9" event={"ID":"d192abbc-1942-4e41-8e85-4416d725ac32","Type":"ContainerStarted","Data":"83b481f0257fcc47140b5d9505f5935c16453a10794ad318c4a1f5214c13ea44"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.263527 4786 generic.go:334] "Generic (PLEG): container finished" podID="8909d231-1928-4f63-b383-856cb26fa4a2" containerID="c3d7a7117c0b8a182edf5a164208dda5d4e7f7b6662de2f2a8cf06ca833520b5" exitCode=0 Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.263593 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lvhh7" event={"ID":"8909d231-1928-4f63-b383-856cb26fa4a2","Type":"ContainerDied","Data":"c3d7a7117c0b8a182edf5a164208dda5d4e7f7b6662de2f2a8cf06ca833520b5"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.274534 4786 generic.go:334] "Generic (PLEG): container finished" podID="d5dd74a7-14cd-4d77-95a3-0d8c98edb870" containerID="0c8aae5fcc18920f5dd30bba03e2ecfbcd982311b38cc18ee5401f8207fe92e9" exitCode=0 Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.274696 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da6e-account-create-update-56fnk" 
event={"ID":"d5dd74a7-14cd-4d77-95a3-0d8c98edb870","Type":"ContainerDied","Data":"0c8aae5fcc18920f5dd30bba03e2ecfbcd982311b38cc18ee5401f8207fe92e9"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.277431 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8shl7" event={"ID":"a594fa40-6352-480d-8927-c04bf51c9c51","Type":"ContainerStarted","Data":"e2c019ff0348a178bded3382ef62270a929b3de44dbd0995c92698df2bef3289"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.279963 4786 generic.go:334] "Generic (PLEG): container finished" podID="a5c2078a-f957-4d60-9a47-f7b0c7248b75" containerID="c5e3569be852ead07be105446ea4c198eaab3173139bc6c84679868abfdda561" exitCode=0 Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.280044 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-467fr" event={"ID":"a5c2078a-f957-4d60-9a47-f7b0c7248b75","Type":"ContainerDied","Data":"c5e3569be852ead07be105446ea4c198eaab3173139bc6c84679868abfdda561"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.280066 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-467fr" event={"ID":"a5c2078a-f957-4d60-9a47-f7b0c7248b75","Type":"ContainerStarted","Data":"250f17e4662b96cb554e33a729eeee6d5737872879ce3370e3f8253bb5a9d693"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.297271 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.297310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f"} Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.317575 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-5197-account-create-update-6g4d9" podStartSLOduration=6.3175585 podStartE2EDuration="6.3175585s" podCreationTimestamp="2026-03-13 12:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:13.310242655 +0000 UTC m=+1340.589896102" watchObservedRunningTime="2026-03-13 12:09:13.3175585 +0000 UTC m=+1340.597211947" Mar 13 12:09:13 crc kubenswrapper[4786]: I0313 12:09:13.334632 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8shl7" podStartSLOduration=3.660287762 podStartE2EDuration="18.334614435s" podCreationTimestamp="2026-03-13 12:08:55 +0000 UTC" firstStartedPulling="2026-03-13 12:08:56.82472283 +0000 UTC m=+1324.104376277" lastFinishedPulling="2026-03-13 12:09:11.499049503 +0000 UTC m=+1338.778702950" observedRunningTime="2026-03-13 12:09:13.323709324 +0000 UTC m=+1340.603362781" watchObservedRunningTime="2026-03-13 12:09:13.334614435 +0000 UTC m=+1340.614267882" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.324325 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b"} Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.327749 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640"} Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.328193 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerStarted","Data":"3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f"} Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.343063 4786 generic.go:334] "Generic (PLEG): container finished" podID="d192abbc-1942-4e41-8e85-4416d725ac32" containerID="63d1c6a4ed628e0270bfaf6f6a59a54966e217bd3f5f6948011b4c84cc5c5d66" exitCode=0 Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.343322 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5197-account-create-update-6g4d9" event={"ID":"d192abbc-1942-4e41-8e85-4416d725ac32","Type":"ContainerDied","Data":"63d1c6a4ed628e0270bfaf6f6a59a54966e217bd3f5f6948011b4c84cc5c5d66"} Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.377708 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.567040535 podStartE2EDuration="52.377682158s" podCreationTimestamp="2026-03-13 12:08:22 +0000 UTC" firstStartedPulling="2026-03-13 12:08:56.688184253 +0000 UTC m=+1323.967837700" lastFinishedPulling="2026-03-13 12:09:11.498825876 +0000 UTC m=+1338.778479323" observedRunningTime="2026-03-13 12:09:14.364521787 +0000 UTC m=+1341.644175234" watchObservedRunningTime="2026-03-13 12:09:14.377682158 +0000 UTC m=+1341.657335605" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.646392 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c8b998f77-rn5ds"] Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.648134 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.649823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.649945 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-swift-storage-0\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.650010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-svc\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.650125 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.650164 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jntmc\" (UniqueName: \"kubernetes.io/projected/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-kube-api-access-jntmc\") pod 
\"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.650228 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-config\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.652011 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.660519 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8b998f77-rn5ds"] Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.752405 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.752465 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jntmc\" (UniqueName: \"kubernetes.io/projected/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-kube-api-access-jntmc\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.752517 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-config\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " 
pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.752547 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.752594 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-swift-storage-0\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.752656 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-svc\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.753694 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-svc\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.754633 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: 
I0313 12:09:14.755335 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.757981 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-swift-storage-0\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.758048 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-config\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.793039 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jntmc\" (UniqueName: \"kubernetes.io/projected/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-kube-api-access-jntmc\") pod \"dnsmasq-dns-7c8b998f77-rn5ds\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:14 crc kubenswrapper[4786]: I0313 12:09:14.968094 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.737842 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-467fr" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.762242 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.766575 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.775409 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.785051 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.813640 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192abbc-1942-4e41-8e85-4416d725ac32-operator-scripts\") pod \"d192abbc-1942-4e41-8e85-4416d725ac32\" (UID: \"d192abbc-1942-4e41-8e85-4416d725ac32\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885409 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4cq5\" (UniqueName: \"kubernetes.io/projected/8909d231-1928-4f63-b383-856cb26fa4a2-kube-api-access-c4cq5\") pod \"8909d231-1928-4f63-b383-856cb26fa4a2\" (UID: \"8909d231-1928-4f63-b383-856cb26fa4a2\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885469 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af72684-e70e-4ff4-a72c-d4e830667645-operator-scripts\") pod \"3af72684-e70e-4ff4-a72c-d4e830667645\" (UID: \"3af72684-e70e-4ff4-a72c-d4e830667645\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 
12:09:16.885599 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8909d231-1928-4f63-b383-856cb26fa4a2-operator-scripts\") pod \"8909d231-1928-4f63-b383-856cb26fa4a2\" (UID: \"8909d231-1928-4f63-b383-856cb26fa4a2\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885625 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c2078a-f957-4d60-9a47-f7b0c7248b75-operator-scripts\") pod \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\" (UID: \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885649 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twnsn\" (UniqueName: \"kubernetes.io/projected/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-kube-api-access-twnsn\") pod \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\" (UID: \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885688 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv57g\" (UniqueName: \"kubernetes.io/projected/d192abbc-1942-4e41-8e85-4416d725ac32-kube-api-access-nv57g\") pod \"d192abbc-1942-4e41-8e85-4416d725ac32\" (UID: \"d192abbc-1942-4e41-8e85-4416d725ac32\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885716 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m45mp\" (UniqueName: \"kubernetes.io/projected/3af72684-e70e-4ff4-a72c-d4e830667645-kube-api-access-m45mp\") pod \"3af72684-e70e-4ff4-a72c-d4e830667645\" (UID: \"3af72684-e70e-4ff4-a72c-d4e830667645\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885762 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6lg6\" (UniqueName: 
\"kubernetes.io/projected/a5c2078a-f957-4d60-9a47-f7b0c7248b75-kube-api-access-r6lg6\") pod \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\" (UID: \"a5c2078a-f957-4d60-9a47-f7b0c7248b75\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.885800 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-operator-scripts\") pod \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\" (UID: \"d5dd74a7-14cd-4d77-95a3-0d8c98edb870\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.892003 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d192abbc-1942-4e41-8e85-4416d725ac32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d192abbc-1942-4e41-8e85-4416d725ac32" (UID: "d192abbc-1942-4e41-8e85-4416d725ac32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.892415 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af72684-e70e-4ff4-a72c-d4e830667645-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3af72684-e70e-4ff4-a72c-d4e830667645" (UID: "3af72684-e70e-4ff4-a72c-d4e830667645"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.892746 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8909d231-1928-4f63-b383-856cb26fa4a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8909d231-1928-4f63-b383-856cb26fa4a2" (UID: "8909d231-1928-4f63-b383-856cb26fa4a2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.893155 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5c2078a-f957-4d60-9a47-f7b0c7248b75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5c2078a-f957-4d60-9a47-f7b0c7248b75" (UID: "a5c2078a-f957-4d60-9a47-f7b0c7248b75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.892014 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5dd74a7-14cd-4d77-95a3-0d8c98edb870" (UID: "d5dd74a7-14cd-4d77-95a3-0d8c98edb870"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.906622 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8909d231-1928-4f63-b383-856cb26fa4a2-kube-api-access-c4cq5" (OuterVolumeSpecName: "kube-api-access-c4cq5") pod "8909d231-1928-4f63-b383-856cb26fa4a2" (UID: "8909d231-1928-4f63-b383-856cb26fa4a2"). InnerVolumeSpecName "kube-api-access-c4cq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.909122 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-kube-api-access-twnsn" (OuterVolumeSpecName: "kube-api-access-twnsn") pod "d5dd74a7-14cd-4d77-95a3-0d8c98edb870" (UID: "d5dd74a7-14cd-4d77-95a3-0d8c98edb870"). InnerVolumeSpecName "kube-api-access-twnsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.909531 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af72684-e70e-4ff4-a72c-d4e830667645-kube-api-access-m45mp" (OuterVolumeSpecName: "kube-api-access-m45mp") pod "3af72684-e70e-4ff4-a72c-d4e830667645" (UID: "3af72684-e70e-4ff4-a72c-d4e830667645"). InnerVolumeSpecName "kube-api-access-m45mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.909762 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c2078a-f957-4d60-9a47-f7b0c7248b75-kube-api-access-r6lg6" (OuterVolumeSpecName: "kube-api-access-r6lg6") pod "a5c2078a-f957-4d60-9a47-f7b0c7248b75" (UID: "a5c2078a-f957-4d60-9a47-f7b0c7248b75"). InnerVolumeSpecName "kube-api-access-r6lg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.912039 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d192abbc-1942-4e41-8e85-4416d725ac32-kube-api-access-nv57g" (OuterVolumeSpecName: "kube-api-access-nv57g") pod "d192abbc-1942-4e41-8e85-4416d725ac32" (UID: "d192abbc-1942-4e41-8e85-4416d725ac32"). InnerVolumeSpecName "kube-api-access-nv57g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.992866 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-operator-scripts\") pod \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\" (UID: \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993054 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r2g7\" (UniqueName: \"kubernetes.io/projected/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-kube-api-access-6r2g7\") pod \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\" (UID: \"6e6328ef-79d0-4db2-a172-1e2bbd1f8923\") " Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993431 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8909d231-1928-4f63-b383-856cb26fa4a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993450 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c2078a-f957-4d60-9a47-f7b0c7248b75-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993460 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twnsn\" (UniqueName: \"kubernetes.io/projected/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-kube-api-access-twnsn\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993471 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv57g\" (UniqueName: \"kubernetes.io/projected/d192abbc-1942-4e41-8e85-4416d725ac32-kube-api-access-nv57g\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993482 4786 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-m45mp\" (UniqueName: \"kubernetes.io/projected/3af72684-e70e-4ff4-a72c-d4e830667645-kube-api-access-m45mp\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993493 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6lg6\" (UniqueName: \"kubernetes.io/projected/a5c2078a-f957-4d60-9a47-f7b0c7248b75-kube-api-access-r6lg6\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993506 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dd74a7-14cd-4d77-95a3-0d8c98edb870-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993517 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192abbc-1942-4e41-8e85-4416d725ac32-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993525 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4cq5\" (UniqueName: \"kubernetes.io/projected/8909d231-1928-4f63-b383-856cb26fa4a2-kube-api-access-c4cq5\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.993533 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af72684-e70e-4ff4-a72c-d4e830667645-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.994080 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e6328ef-79d0-4db2-a172-1e2bbd1f8923" (UID: "6e6328ef-79d0-4db2-a172-1e2bbd1f8923"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:16 crc kubenswrapper[4786]: I0313 12:09:16.996989 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-kube-api-access-6r2g7" (OuterVolumeSpecName: "kube-api-access-6r2g7") pod "6e6328ef-79d0-4db2-a172-1e2bbd1f8923" (UID: "6e6328ef-79d0-4db2-a172-1e2bbd1f8923"). InnerVolumeSpecName "kube-api-access-6r2g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.095206 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r2g7\" (UniqueName: \"kubernetes.io/projected/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-kube-api-access-6r2g7\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.095251 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e6328ef-79d0-4db2-a172-1e2bbd1f8923-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.141438 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8b998f77-rn5ds"] Mar 13 12:09:17 crc kubenswrapper[4786]: W0313 12:09:17.143954 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4188daf_2fcb_49ae_b0af_ac1e6b6b6fbc.slice/crio-1085305dc9c9253464712745558eccf5cc77974eb80442e31884291fd12f5a29 WatchSource:0}: Error finding container 1085305dc9c9253464712745558eccf5cc77974eb80442e31884291fd12f5a29: Status 404 returned error can't find the container with id 1085305dc9c9253464712745558eccf5cc77974eb80442e31884291fd12f5a29 Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.376069 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-467fr" 
event={"ID":"a5c2078a-f957-4d60-9a47-f7b0c7248b75","Type":"ContainerDied","Data":"250f17e4662b96cb554e33a729eeee6d5737872879ce3370e3f8253bb5a9d693"} Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.376114 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="250f17e4662b96cb554e33a729eeee6d5737872879ce3370e3f8253bb5a9d693" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.376197 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-467fr" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.385237 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rj4n5" event={"ID":"3af72684-e70e-4ff4-a72c-d4e830667645","Type":"ContainerDied","Data":"9685946a323da845c02798b237e5c81e33ef1323e1a46cc65dbd8ba6b85d62b3"} Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.385260 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rj4n5" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.385270 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9685946a323da845c02798b237e5c81e33ef1323e1a46cc65dbd8ba6b85d62b3" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.390338 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3078-account-create-update-n6pkt" event={"ID":"6e6328ef-79d0-4db2-a172-1e2bbd1f8923","Type":"ContainerDied","Data":"dce7fb51c758f4b2ef0ea04eadc75490f026350fdce1e0b14612f41bcbd765cf"} Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.390368 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce7fb51c758f4b2ef0ea04eadc75490f026350fdce1e0b14612f41bcbd765cf" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.390371 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3078-account-create-update-n6pkt" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.391722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5197-account-create-update-6g4d9" event={"ID":"d192abbc-1942-4e41-8e85-4416d725ac32","Type":"ContainerDied","Data":"83b481f0257fcc47140b5d9505f5935c16453a10794ad318c4a1f5214c13ea44"} Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.391747 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b481f0257fcc47140b5d9505f5935c16453a10794ad318c4a1f5214c13ea44" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.391800 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5197-account-create-update-6g4d9" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.394888 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" event={"ID":"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc","Type":"ContainerStarted","Data":"1085305dc9c9253464712745558eccf5cc77974eb80442e31884291fd12f5a29"} Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.397091 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lvhh7" event={"ID":"8909d231-1928-4f63-b383-856cb26fa4a2","Type":"ContainerDied","Data":"03cc58646b47e6deceaf00aa893267c9406804fc51845e6e05126a91e57da249"} Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.397126 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03cc58646b47e6deceaf00aa893267c9406804fc51845e6e05126a91e57da249" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.397185 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-lvhh7" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.408426 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4l225" event={"ID":"ce773763-3741-4253-87c8-9726920b41dc","Type":"ContainerStarted","Data":"87c9337e5f8be7831921a2cc00598ccb8555b8faf4b6f237f998d7e8ccce7644"} Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.411558 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da6e-account-create-update-56fnk" event={"ID":"d5dd74a7-14cd-4d77-95a3-0d8c98edb870","Type":"ContainerDied","Data":"571e6f5b91b4e3a588a661955d4fe7068ed8f41e856a842943088c2869fe3091"} Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.411592 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="571e6f5b91b4e3a588a661955d4fe7068ed8f41e856a842943088c2869fe3091" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.411756 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-da6e-account-create-update-56fnk" Mar 13 12:09:17 crc kubenswrapper[4786]: I0313 12:09:17.429014 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4l225" podStartSLOduration=5.668841553 podStartE2EDuration="10.428997133s" podCreationTimestamp="2026-03-13 12:09:07 +0000 UTC" firstStartedPulling="2026-03-13 12:09:12.066274102 +0000 UTC m=+1339.345927559" lastFinishedPulling="2026-03-13 12:09:16.826429692 +0000 UTC m=+1344.106083139" observedRunningTime="2026-03-13 12:09:17.428196921 +0000 UTC m=+1344.707850378" watchObservedRunningTime="2026-03-13 12:09:17.428997133 +0000 UTC m=+1344.708650570" Mar 13 12:09:18 crc kubenswrapper[4786]: I0313 12:09:18.420552 4786 generic.go:334] "Generic (PLEG): container finished" podID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerID="d39ea3a1da6f99cbb62fe791878f5ef9cce1d109815f5a80560e00ab50025aca" exitCode=0 Mar 13 12:09:18 crc kubenswrapper[4786]: I0313 12:09:18.422123 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" event={"ID":"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc","Type":"ContainerDied","Data":"d39ea3a1da6f99cbb62fe791878f5ef9cce1d109815f5a80560e00ab50025aca"} Mar 13 12:09:19 crc kubenswrapper[4786]: I0313 12:09:19.433129 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" event={"ID":"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc","Type":"ContainerStarted","Data":"25fafa71eab8767caed9369e6ea23c868d0fcaca877ac8f3111397e078cc1a60"} Mar 13 12:09:27 crc kubenswrapper[4786]: I0313 12:09:27.513244 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:27 crc kubenswrapper[4786]: I0313 12:09:27.515628 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:09:27 crc kubenswrapper[4786]: I0313 
12:09:27.545431 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" podStartSLOduration=13.545400561 podStartE2EDuration="13.545400561s" podCreationTimestamp="2026-03-13 12:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:27.535787134 +0000 UTC m=+1354.815440611" watchObservedRunningTime="2026-03-13 12:09:27.545400561 +0000 UTC m=+1354.825054088" Mar 13 12:09:27 crc kubenswrapper[4786]: I0313 12:09:27.603514 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-f5l28"] Mar 13 12:09:27 crc kubenswrapper[4786]: I0313 12:09:27.603850 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" podUID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerName="dnsmasq-dns" containerID="cri-o://edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b" gracePeriod=10 Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.141571 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.183561 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-nb\") pod \"5369266d-f8f8-4667-a8dd-0f316e959fc0\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.183784 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-dns-svc\") pod \"5369266d-f8f8-4667-a8dd-0f316e959fc0\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.183966 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpx7k\" (UniqueName: \"kubernetes.io/projected/5369266d-f8f8-4667-a8dd-0f316e959fc0-kube-api-access-cpx7k\") pod \"5369266d-f8f8-4667-a8dd-0f316e959fc0\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.184105 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-sb\") pod \"5369266d-f8f8-4667-a8dd-0f316e959fc0\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.184234 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-config\") pod \"5369266d-f8f8-4667-a8dd-0f316e959fc0\" (UID: \"5369266d-f8f8-4667-a8dd-0f316e959fc0\") " Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.197242 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5369266d-f8f8-4667-a8dd-0f316e959fc0-kube-api-access-cpx7k" (OuterVolumeSpecName: "kube-api-access-cpx7k") pod "5369266d-f8f8-4667-a8dd-0f316e959fc0" (UID: "5369266d-f8f8-4667-a8dd-0f316e959fc0"). InnerVolumeSpecName "kube-api-access-cpx7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.234053 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5369266d-f8f8-4667-a8dd-0f316e959fc0" (UID: "5369266d-f8f8-4667-a8dd-0f316e959fc0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.256357 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5369266d-f8f8-4667-a8dd-0f316e959fc0" (UID: "5369266d-f8f8-4667-a8dd-0f316e959fc0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.263097 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5369266d-f8f8-4667-a8dd-0f316e959fc0" (UID: "5369266d-f8f8-4667-a8dd-0f316e959fc0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.286927 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.286961 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.286972 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpx7k\" (UniqueName: \"kubernetes.io/projected/5369266d-f8f8-4667-a8dd-0f316e959fc0-kube-api-access-cpx7k\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.286981 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.287820 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-config" (OuterVolumeSpecName: "config") pod "5369266d-f8f8-4667-a8dd-0f316e959fc0" (UID: "5369266d-f8f8-4667-a8dd-0f316e959fc0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.388362 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5369266d-f8f8-4667-a8dd-0f316e959fc0-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.523322 4786 generic.go:334] "Generic (PLEG): container finished" podID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerID="edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b" exitCode=0 Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.523386 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.523447 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" event={"ID":"5369266d-f8f8-4667-a8dd-0f316e959fc0","Type":"ContainerDied","Data":"edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b"} Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.523484 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" event={"ID":"5369266d-f8f8-4667-a8dd-0f316e959fc0","Type":"ContainerDied","Data":"2ecf1a8cb1d55cf9fbf520fc70ccfda3d8c91d42d13c46d079cc7a3973484a5e"} Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.523508 4786 scope.go:117] "RemoveContainer" containerID="edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.555276 4786 scope.go:117] "RemoveContainer" containerID="76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.569580 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-f5l28"] Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.586439 4786 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-f5l28"] Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.589095 4786 scope.go:117] "RemoveContainer" containerID="edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b" Mar 13 12:09:28 crc kubenswrapper[4786]: E0313 12:09:28.589548 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b\": container with ID starting with edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b not found: ID does not exist" containerID="edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.589577 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b"} err="failed to get container status \"edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b\": rpc error: code = NotFound desc = could not find container \"edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b\": container with ID starting with edf379e3a18011ffd0f00e1760565ba9ef46de8932538110299d4f135b5a0b6b not found: ID does not exist" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 12:09:28.589610 4786 scope.go:117] "RemoveContainer" containerID="76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542" Mar 13 12:09:28 crc kubenswrapper[4786]: E0313 12:09:28.590092 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542\": container with ID starting with 76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542 not found: ID does not exist" containerID="76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542" Mar 13 12:09:28 crc kubenswrapper[4786]: I0313 
12:09:28.590144 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542"} err="failed to get container status \"76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542\": rpc error: code = NotFound desc = could not find container \"76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542\": container with ID starting with 76f219285f070af854eb0c6ba95d105e3ed3e3b95fe90a22d5bdd0b540833542 not found: ID does not exist" Mar 13 12:09:29 crc kubenswrapper[4786]: I0313 12:09:29.455407 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5369266d-f8f8-4667-a8dd-0f316e959fc0" path="/var/lib/kubelet/pods/5369266d-f8f8-4667-a8dd-0f316e959fc0/volumes" Mar 13 12:09:33 crc kubenswrapper[4786]: I0313 12:09:33.069245 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b9fd7d84c-f5l28" podUID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Mar 13 12:09:34 crc kubenswrapper[4786]: I0313 12:09:34.574048 4786 generic.go:334] "Generic (PLEG): container finished" podID="ce773763-3741-4253-87c8-9726920b41dc" containerID="87c9337e5f8be7831921a2cc00598ccb8555b8faf4b6f237f998d7e8ccce7644" exitCode=0 Mar 13 12:09:34 crc kubenswrapper[4786]: I0313 12:09:34.574095 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4l225" event={"ID":"ce773763-3741-4253-87c8-9726920b41dc","Type":"ContainerDied","Data":"87c9337e5f8be7831921a2cc00598ccb8555b8faf4b6f237f998d7e8ccce7644"} Mar 13 12:09:35 crc kubenswrapper[4786]: I0313 12:09:35.902672 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.017988 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-combined-ca-bundle\") pod \"ce773763-3741-4253-87c8-9726920b41dc\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.018771 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-config-data\") pod \"ce773763-3741-4253-87c8-9726920b41dc\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.018938 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzvg4\" (UniqueName: \"kubernetes.io/projected/ce773763-3741-4253-87c8-9726920b41dc-kube-api-access-zzvg4\") pod \"ce773763-3741-4253-87c8-9726920b41dc\" (UID: \"ce773763-3741-4253-87c8-9726920b41dc\") " Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.026103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce773763-3741-4253-87c8-9726920b41dc-kube-api-access-zzvg4" (OuterVolumeSpecName: "kube-api-access-zzvg4") pod "ce773763-3741-4253-87c8-9726920b41dc" (UID: "ce773763-3741-4253-87c8-9726920b41dc"). InnerVolumeSpecName "kube-api-access-zzvg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.046601 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce773763-3741-4253-87c8-9726920b41dc" (UID: "ce773763-3741-4253-87c8-9726920b41dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.085455 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-config-data" (OuterVolumeSpecName: "config-data") pod "ce773763-3741-4253-87c8-9726920b41dc" (UID: "ce773763-3741-4253-87c8-9726920b41dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.122243 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.122285 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce773763-3741-4253-87c8-9726920b41dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.122299 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzvg4\" (UniqueName: \"kubernetes.io/projected/ce773763-3741-4253-87c8-9726920b41dc-kube-api-access-zzvg4\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.591869 4786 generic.go:334] "Generic (PLEG): container finished" podID="a594fa40-6352-480d-8927-c04bf51c9c51" containerID="e2c019ff0348a178bded3382ef62270a929b3de44dbd0995c92698df2bef3289" exitCode=0 Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.591955 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8shl7" event={"ID":"a594fa40-6352-480d-8927-c04bf51c9c51","Type":"ContainerDied","Data":"e2c019ff0348a178bded3382ef62270a929b3de44dbd0995c92698df2bef3289"} Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.593873 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-4l225" event={"ID":"ce773763-3741-4253-87c8-9726920b41dc","Type":"ContainerDied","Data":"d739d97b41bf6a26ae9226f02a0cfcae724025883c938baaaff14eb2aeef1309"} Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.593922 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d739d97b41bf6a26ae9226f02a0cfcae724025883c938baaaff14eb2aeef1309" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.593981 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4l225" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.892482 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bzltb"] Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893248 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d192abbc-1942-4e41-8e85-4416d725ac32" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893270 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d192abbc-1942-4e41-8e85-4416d725ac32" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893293 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6328ef-79d0-4db2-a172-1e2bbd1f8923" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893302 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6328ef-79d0-4db2-a172-1e2bbd1f8923" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893316 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8909d231-1928-4f63-b383-856cb26fa4a2" containerName="mariadb-database-create" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893324 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8909d231-1928-4f63-b383-856cb26fa4a2" 
containerName="mariadb-database-create" Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893335 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c2078a-f957-4d60-9a47-f7b0c7248b75" containerName="mariadb-database-create" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893343 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c2078a-f957-4d60-9a47-f7b0c7248b75" containerName="mariadb-database-create" Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893357 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce773763-3741-4253-87c8-9726920b41dc" containerName="keystone-db-sync" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893364 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce773763-3741-4253-87c8-9726920b41dc" containerName="keystone-db-sync" Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893381 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af72684-e70e-4ff4-a72c-d4e830667645" containerName="mariadb-database-create" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893388 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af72684-e70e-4ff4-a72c-d4e830667645" containerName="mariadb-database-create" Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893403 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerName="dnsmasq-dns" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893410 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerName="dnsmasq-dns" Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893430 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerName="init" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893437 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerName="init" Mar 13 12:09:36 crc kubenswrapper[4786]: E0313 12:09:36.893455 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5dd74a7-14cd-4d77-95a3-0d8c98edb870" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893464 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5dd74a7-14cd-4d77-95a3-0d8c98edb870" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893654 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d192abbc-1942-4e41-8e85-4416d725ac32" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893715 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af72684-e70e-4ff4-a72c-d4e830667645" containerName="mariadb-database-create" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893729 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5369266d-f8f8-4667-a8dd-0f316e959fc0" containerName="dnsmasq-dns" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893741 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6328ef-79d0-4db2-a172-1e2bbd1f8923" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893753 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce773763-3741-4253-87c8-9726920b41dc" containerName="keystone-db-sync" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893762 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5dd74a7-14cd-4d77-95a3-0d8c98edb870" containerName="mariadb-account-create-update" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.893770 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c2078a-f957-4d60-9a47-f7b0c7248b75" containerName="mariadb-database-create" Mar 13 12:09:36 crc 
kubenswrapper[4786]: I0313 12:09:36.893782 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8909d231-1928-4f63-b383-856cb26fa4a2" containerName="mariadb-database-create" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.894501 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.900013 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.900261 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.900505 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.900718 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-66qfn" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.900854 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.901973 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-789bb6d89c-l5c8s"] Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.903588 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.918945 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bzltb"] Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.936733 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-credential-keys\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.936757 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-789bb6d89c-l5c8s"] Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.936797 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-config-data\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.936818 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nppt\" (UniqueName: \"kubernetes.io/projected/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-kube-api-access-7nppt\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.936855 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-combined-ca-bundle\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " 
pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.936919 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-fernet-keys\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:36 crc kubenswrapper[4786]: I0313 12:09:36.937002 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-scripts\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038552 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59c97\" (UniqueName: \"kubernetes.io/projected/ce89ba18-706e-460f-97db-35a3d5008268-kube-api-access-59c97\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038614 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-config-data\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038636 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nppt\" (UniqueName: \"kubernetes.io/projected/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-kube-api-access-7nppt\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " 
pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038655 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-combined-ca-bundle\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038674 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-svc\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038691 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-fernet-keys\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038719 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-sb\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038751 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-swift-storage-0\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " 
pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038769 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-scripts\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038801 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-nb\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-config\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.038860 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-credential-keys\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.042803 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-fernet-keys\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 
12:09:37.043064 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-combined-ca-bundle\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.043269 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-scripts\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.045420 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-config-data\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.048413 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-credential-keys\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.086360 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nppt\" (UniqueName: \"kubernetes.io/projected/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-kube-api-access-7nppt\") pod \"keystone-bootstrap-bzltb\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.090617 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-59gr5"] Mar 13 12:09:37 crc 
kubenswrapper[4786]: I0313 12:09:37.091775 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.106169 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8pp56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.106420 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-59gr5"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.107678 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.107824 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.124704 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-prqp8"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.131424 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.136978 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.137697 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.137967 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mnmqw" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140364 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-config\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59c97\" (UniqueName: \"kubernetes.io/projected/ce89ba18-706e-460f-97db-35a3d5008268-kube-api-access-59c97\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140450 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-svc\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140481 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-sb\") pod 
\"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140502 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-scripts\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140529 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-combined-ca-bundle\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140548 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghrg5\" (UniqueName: \"kubernetes.io/projected/6d0fe660-4646-4b25-b5b6-b24989d78be4-kube-api-access-ghrg5\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140568 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-swift-storage-0\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140598 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d0fe660-4646-4b25-b5b6-b24989d78be4-etc-machine-id\") pod 
\"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140620 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-nb\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140648 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-db-sync-config-data\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.140664 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-config-data\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.141495 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-config\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.141951 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-svc\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " 
pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.142090 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-swift-storage-0\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.142589 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-sb\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.143190 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-nb\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.155710 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-prqp8"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.213996 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59c97\" (UniqueName: \"kubernetes.io/projected/ce89ba18-706e-460f-97db-35a3d5008268-kube-api-access-59c97\") pod \"dnsmasq-dns-789bb6d89c-l5c8s\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.221001 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.230018 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.232332 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.234817 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.242229 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.242487 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.243286 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8qp5\" (UniqueName: \"kubernetes.io/projected/07961f7a-7824-4e7d-b30a-e47699b2ca0f-kube-api-access-r8qp5\") pod \"neutron-db-sync-prqp8\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.243415 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-config\") pod \"neutron-db-sync-prqp8\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.243533 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-scripts\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " 
pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.243633 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-combined-ca-bundle\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.243723 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghrg5\" (UniqueName: \"kubernetes.io/projected/6d0fe660-4646-4b25-b5b6-b24989d78be4-kube-api-access-ghrg5\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.243838 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d0fe660-4646-4b25-b5b6-b24989d78be4-etc-machine-id\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.243962 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-combined-ca-bundle\") pod \"neutron-db-sync-prqp8\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.244079 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-db-sync-config-data\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: 
I0313 12:09:37.244171 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-config-data\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.249672 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-combined-ca-bundle\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.251431 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d0fe660-4646-4b25-b5b6-b24989d78be4-etc-machine-id\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.268409 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-db-sync-config-data\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.269643 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-config-data\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.278617 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-scripts\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.288585 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghrg5\" (UniqueName: \"kubernetes.io/projected/6d0fe660-4646-4b25-b5b6-b24989d78be4-kube-api-access-ghrg5\") pod \"cinder-db-sync-59gr5\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.316605 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-x5ns6"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.323952 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.328968 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.329162 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7jf6w" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.329269 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.345437 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-run-httpd\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.345864 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jw8\" (UniqueName: 
\"kubernetes.io/projected/da5e2292-690b-4774-833c-5823cfb8f6ca-kube-api-access-p2jw8\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.346033 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-log-httpd\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.346169 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-config-data\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.346339 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.346485 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8qp5\" (UniqueName: \"kubernetes.io/projected/07961f7a-7824-4e7d-b30a-e47699b2ca0f-kube-api-access-r8qp5\") pod \"neutron-db-sync-prqp8\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.346636 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-config\") pod \"neutron-db-sync-prqp8\" (UID: 
\"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.346757 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-scripts\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.346937 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.347139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-combined-ca-bundle\") pod \"neutron-db-sync-prqp8\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.350409 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.359175 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-combined-ca-bundle\") pod \"neutron-db-sync-prqp8\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.359259 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-config\") pod \"neutron-db-sync-prqp8\" (UID: 
\"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.386447 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789bb6d89c-l5c8s"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.402510 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8qp5\" (UniqueName: \"kubernetes.io/projected/07961f7a-7824-4e7d-b30a-e47699b2ca0f-kube-api-access-r8qp5\") pod \"neutron-db-sync-prqp8\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.405905 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x5ns6"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.415377 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5rb8r"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.416662 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.419201 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.419681 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rz4kw" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.449705 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2jw8\" (UniqueName: \"kubernetes.io/projected/da5e2292-690b-4774-833c-5823cfb8f6ca-kube-api-access-p2jw8\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.449858 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-db-sync-config-data\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.449982 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-log-httpd\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-config-data\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450180 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-config-data\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450273 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450358 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-scripts\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450440 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450542 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-scripts\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450707 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-combined-ca-bundle\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450799 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxnx6\" (UniqueName: \"kubernetes.io/projected/d69f3ce2-7166-46f3-8381-987837e3383e-kube-api-access-jxnx6\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.450899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklb7\" (UniqueName: \"kubernetes.io/projected/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-kube-api-access-qklb7\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.451003 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69f3ce2-7166-46f3-8381-987837e3383e-logs\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.451110 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-run-httpd\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.451191 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-combined-ca-bundle\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.452016 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-log-httpd\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.455302 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-run-httpd\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.456625 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-config-data\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.459558 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.470838 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-scripts\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.471708 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.475983 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79d475cf97-tzr56"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.477713 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.483907 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-59gr5" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.495683 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5rb8r"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.496345 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-prqp8" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.519580 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2jw8\" (UniqueName: \"kubernetes.io/projected/da5e2292-690b-4774-833c-5823cfb8f6ca-kube-api-access-p2jw8\") pod \"ceilometer-0\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.555473 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-svc\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.555538 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69f3ce2-7166-46f3-8381-987837e3383e-logs\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.555607 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-combined-ca-bundle\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.555664 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-db-sync-config-data\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 
12:09:37.555715 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-config-data\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.555753 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-config\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.555795 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-swift-storage-0\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.555916 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-sb\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.556076 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-scripts\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.556257 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-combined-ca-bundle\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.556300 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdwqh\" (UniqueName: \"kubernetes.io/projected/f49ffdce-d078-4718-95a4-df84a9d7abda-kube-api-access-cdwqh\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.556333 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxnx6\" (UniqueName: \"kubernetes.io/projected/d69f3ce2-7166-46f3-8381-987837e3383e-kube-api-access-jxnx6\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.556367 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-nb\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.560339 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qklb7\" (UniqueName: \"kubernetes.io/projected/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-kube-api-access-qklb7\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.561579 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69f3ce2-7166-46f3-8381-987837e3383e-logs\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.565904 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-combined-ca-bundle\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.571508 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-db-sync-config-data\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.572132 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-config-data\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.576838 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-combined-ca-bundle\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.578573 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-scripts\") pod 
\"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.582645 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79d475cf97-tzr56"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.584744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qklb7\" (UniqueName: \"kubernetes.io/projected/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-kube-api-access-qklb7\") pod \"barbican-db-sync-5rb8r\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") " pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.596297 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxnx6\" (UniqueName: \"kubernetes.io/projected/d69f3ce2-7166-46f3-8381-987837e3383e-kube-api-access-jxnx6\") pod \"placement-db-sync-x5ns6\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") " pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.663983 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-config\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.664202 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-swift-storage-0\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.664265 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-sb\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.664328 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdwqh\" (UniqueName: \"kubernetes.io/projected/f49ffdce-d078-4718-95a4-df84a9d7abda-kube-api-access-cdwqh\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.664356 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-nb\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.664388 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-svc\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.665721 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-sb\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.666179 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-swift-storage-0\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.666601 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-nb\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.667389 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-svc\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.667925 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-config\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.674117 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.686954 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-x5ns6" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.702591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdwqh\" (UniqueName: \"kubernetes.io/projected/f49ffdce-d078-4718-95a4-df84a9d7abda-kube-api-access-cdwqh\") pod \"dnsmasq-dns-79d475cf97-tzr56\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.801040 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.807471 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.867969 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789bb6d89c-l5c8s"] Mar 13 12:09:37 crc kubenswrapper[4786]: I0313 12:09:37.953779 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bzltb"] Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.157589 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-59gr5"] Mar 13 12:09:38 crc kubenswrapper[4786]: W0313 12:09:38.184140 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d0fe660_4646_4b25_b5b6_b24989d78be4.slice/crio-bc480304a1fe5865f4caeba93f47411f510feb6f839716604f1aafa45381d28f WatchSource:0}: Error finding container bc480304a1fe5865f4caeba93f47411f510feb6f839716604f1aafa45381d28f: Status 404 returned error can't find the container with id bc480304a1fe5865f4caeba93f47411f510feb6f839716604f1aafa45381d28f Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.196491 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-prqp8"] 
Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.325458 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8shl7" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.326959 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x5ns6"] Mar 13 12:09:38 crc kubenswrapper[4786]: W0313 12:09:38.352164 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd69f3ce2_7166_46f3_8381_987837e3383e.slice/crio-d6c12a35ad8e7a4161a9a6ec101c8134f50fe79ef1470636e95afe7578c2848b WatchSource:0}: Error finding container d6c12a35ad8e7a4161a9a6ec101c8134f50fe79ef1470636e95afe7578c2848b: Status 404 returned error can't find the container with id d6c12a35ad8e7a4161a9a6ec101c8134f50fe79ef1470636e95afe7578c2848b Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.390532 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-config-data\") pod \"a594fa40-6352-480d-8927-c04bf51c9c51\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.391386 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-combined-ca-bundle\") pod \"a594fa40-6352-480d-8927-c04bf51c9c51\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.391431 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvbkd\" (UniqueName: \"kubernetes.io/projected/a594fa40-6352-480d-8927-c04bf51c9c51-kube-api-access-dvbkd\") pod \"a594fa40-6352-480d-8927-c04bf51c9c51\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " Mar 13 12:09:38 crc kubenswrapper[4786]: 
I0313 12:09:38.391482 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-db-sync-config-data\") pod \"a594fa40-6352-480d-8927-c04bf51c9c51\" (UID: \"a594fa40-6352-480d-8927-c04bf51c9c51\") " Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.394940 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a594fa40-6352-480d-8927-c04bf51c9c51" (UID: "a594fa40-6352-480d-8927-c04bf51c9c51"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.395178 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a594fa40-6352-480d-8927-c04bf51c9c51-kube-api-access-dvbkd" (OuterVolumeSpecName: "kube-api-access-dvbkd") pod "a594fa40-6352-480d-8927-c04bf51c9c51" (UID: "a594fa40-6352-480d-8927-c04bf51c9c51"). InnerVolumeSpecName "kube-api-access-dvbkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.431420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a594fa40-6352-480d-8927-c04bf51c9c51" (UID: "a594fa40-6352-480d-8927-c04bf51c9c51"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.466263 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-config-data" (OuterVolumeSpecName: "config-data") pod "a594fa40-6352-480d-8927-c04bf51c9c51" (UID: "a594fa40-6352-480d-8927-c04bf51c9c51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.487918 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.493298 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.493324 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvbkd\" (UniqueName: \"kubernetes.io/projected/a594fa40-6352-480d-8927-c04bf51c9c51-kube-api-access-dvbkd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.493334 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.493342 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a594fa40-6352-480d-8927-c04bf51c9c51-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.497748 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5rb8r"] Mar 13 12:09:38 crc kubenswrapper[4786]: W0313 12:09:38.505075 4786 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod108a37cf_a5a0_4ffd_b609_bb0bd4d28bfb.slice/crio-bfbb611e4387ad5b69833822f4fed363ded3fc70c736bd780cad7a7a3971a223 WatchSource:0}: Error finding container bfbb611e4387ad5b69833822f4fed363ded3fc70c736bd780cad7a7a3971a223: Status 404 returned error can't find the container with id bfbb611e4387ad5b69833822f4fed363ded3fc70c736bd780cad7a7a3971a223 Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.535799 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79d475cf97-tzr56"] Mar 13 12:09:38 crc kubenswrapper[4786]: W0313 12:09:38.549987 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf49ffdce_d078_4718_95a4_df84a9d7abda.slice/crio-f9447354095ecc1a35d53f50a79f337b3bd5147ba65d9c09e4d3115404efcd70 WatchSource:0}: Error finding container f9447354095ecc1a35d53f50a79f337b3bd5147ba65d9c09e4d3115404efcd70: Status 404 returned error can't find the container with id f9447354095ecc1a35d53f50a79f337b3bd5147ba65d9c09e4d3115404efcd70 Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.625580 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-prqp8" event={"ID":"07961f7a-7824-4e7d-b30a-e47699b2ca0f","Type":"ContainerStarted","Data":"e21765bcd30df602e32b19fc2333d47e765300a461805329a8f59e62307e6b89"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.634372 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-59gr5" event={"ID":"6d0fe660-4646-4b25-b5b6-b24989d78be4","Type":"ContainerStarted","Data":"bc480304a1fe5865f4caeba93f47411f510feb6f839716604f1aafa45381d28f"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.635561 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x5ns6" 
event={"ID":"d69f3ce2-7166-46f3-8381-987837e3383e","Type":"ContainerStarted","Data":"d6c12a35ad8e7a4161a9a6ec101c8134f50fe79ef1470636e95afe7578c2848b"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.639178 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bzltb" event={"ID":"f0a492f1-4ef8-4fc6-86a4-03e13932d63a","Type":"ContainerStarted","Data":"685e02175a791afca8074c1ecfc142942819db375469354325635b81b8371cce"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.641793 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerStarted","Data":"b6145fcf8363eb1b8be26b0e9be40f06c0ecdde5f2973f5a3d5a25ea5b9501b5"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.644169 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5rb8r" event={"ID":"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb","Type":"ContainerStarted","Data":"bfbb611e4387ad5b69833822f4fed363ded3fc70c736bd780cad7a7a3971a223"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.654455 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" event={"ID":"ce89ba18-706e-460f-97db-35a3d5008268","Type":"ContainerStarted","Data":"a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.654661 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" event={"ID":"ce89ba18-706e-460f-97db-35a3d5008268","Type":"ContainerStarted","Data":"97dcabc82317146accef4e42a70bcd33fc7702ac6dd1ea8ccc7aeb4030c24c39"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.692157 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d475cf97-tzr56" 
event={"ID":"f49ffdce-d078-4718-95a4-df84a9d7abda","Type":"ContainerStarted","Data":"f9447354095ecc1a35d53f50a79f337b3bd5147ba65d9c09e4d3115404efcd70"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.695107 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bzltb" podStartSLOduration=2.695082648 podStartE2EDuration="2.695082648s" podCreationTimestamp="2026-03-13 12:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:38.673379379 +0000 UTC m=+1365.953032846" watchObservedRunningTime="2026-03-13 12:09:38.695082648 +0000 UTC m=+1365.974736095" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.700087 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8shl7" event={"ID":"a594fa40-6352-480d-8927-c04bf51c9c51","Type":"ContainerDied","Data":"10933b84917754f26775161eea8e84c490f5925d3d157dfe473aa1b08e695c05"} Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.700125 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10933b84917754f26775161eea8e84c490f5925d3d157dfe473aa1b08e695c05" Mar 13 12:09:38 crc kubenswrapper[4786]: I0313 12:09:38.700175 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8shl7" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.066266 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.105443 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-config\") pod \"ce89ba18-706e-460f-97db-35a3d5008268\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.105512 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-swift-storage-0\") pod \"ce89ba18-706e-460f-97db-35a3d5008268\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.105577 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-svc\") pod \"ce89ba18-706e-460f-97db-35a3d5008268\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.105616 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-sb\") pod \"ce89ba18-706e-460f-97db-35a3d5008268\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.105684 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59c97\" (UniqueName: \"kubernetes.io/projected/ce89ba18-706e-460f-97db-35a3d5008268-kube-api-access-59c97\") pod \"ce89ba18-706e-460f-97db-35a3d5008268\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.105703 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-nb\") pod \"ce89ba18-706e-460f-97db-35a3d5008268\" (UID: \"ce89ba18-706e-460f-97db-35a3d5008268\") " Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.150516 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce89ba18-706e-460f-97db-35a3d5008268" (UID: "ce89ba18-706e-460f-97db-35a3d5008268"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.153401 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce89ba18-706e-460f-97db-35a3d5008268" (UID: "ce89ba18-706e-460f-97db-35a3d5008268"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.153839 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-config" (OuterVolumeSpecName: "config") pod "ce89ba18-706e-460f-97db-35a3d5008268" (UID: "ce89ba18-706e-460f-97db-35a3d5008268"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.167458 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce89ba18-706e-460f-97db-35a3d5008268-kube-api-access-59c97" (OuterVolumeSpecName: "kube-api-access-59c97") pod "ce89ba18-706e-460f-97db-35a3d5008268" (UID: "ce89ba18-706e-460f-97db-35a3d5008268"). InnerVolumeSpecName "kube-api-access-59c97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.168125 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79d475cf97-tzr56"] Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.174070 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce89ba18-706e-460f-97db-35a3d5008268" (UID: "ce89ba18-706e-460f-97db-35a3d5008268"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.213586 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.213638 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59c97\" (UniqueName: \"kubernetes.io/projected/ce89ba18-706e-460f-97db-35a3d5008268-kube-api-access-59c97\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.213655 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.213667 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.213700 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc 
kubenswrapper[4786]: I0313 12:09:39.218111 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fd49cc99-szjf2"] Mar 13 12:09:39 crc kubenswrapper[4786]: E0313 12:09:39.218516 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a594fa40-6352-480d-8927-c04bf51c9c51" containerName="glance-db-sync" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.218531 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a594fa40-6352-480d-8927-c04bf51c9c51" containerName="glance-db-sync" Mar 13 12:09:39 crc kubenswrapper[4786]: E0313 12:09:39.218566 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce89ba18-706e-460f-97db-35a3d5008268" containerName="init" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.218574 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce89ba18-706e-460f-97db-35a3d5008268" containerName="init" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.218765 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce89ba18-706e-460f-97db-35a3d5008268" containerName="init" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.218777 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a594fa40-6352-480d-8927-c04bf51c9c51" containerName="glance-db-sync" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.225700 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.228534 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce89ba18-706e-460f-97db-35a3d5008268" (UID: "ce89ba18-706e-460f-97db-35a3d5008268"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.241405 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd49cc99-szjf2"] Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.315422 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce89ba18-706e-460f-97db-35a3d5008268-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.418096 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.418151 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.418197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-svc\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.418220 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.418248 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkr8p\" (UniqueName: \"kubernetes.io/projected/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-kube-api-access-wkr8p\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.418290 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-config\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.520568 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.520639 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.520700 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-svc\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" 
(UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.520719 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.520744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkr8p\" (UniqueName: \"kubernetes.io/projected/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-kube-api-access-wkr8p\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.520807 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-config\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.522056 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.522686 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-config\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " 
pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.522736 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.523228 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-svc\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.523899 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.555469 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkr8p\" (UniqueName: \"kubernetes.io/projected/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-kube-api-access-wkr8p\") pod \"dnsmasq-dns-6fd49cc99-szjf2\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.582062 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.717056 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-prqp8" event={"ID":"07961f7a-7824-4e7d-b30a-e47699b2ca0f","Type":"ContainerStarted","Data":"ae16e2216939862263bfe245efece6c23823d38bfbc785950f78f2415d0c22ac"} Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.722845 4786 generic.go:334] "Generic (PLEG): container finished" podID="ce89ba18-706e-460f-97db-35a3d5008268" containerID="a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8" exitCode=0 Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.723002 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.723102 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" event={"ID":"ce89ba18-706e-460f-97db-35a3d5008268","Type":"ContainerDied","Data":"a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8"} Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.723153 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789bb6d89c-l5c8s" event={"ID":"ce89ba18-706e-460f-97db-35a3d5008268","Type":"ContainerDied","Data":"97dcabc82317146accef4e42a70bcd33fc7702ac6dd1ea8ccc7aeb4030c24c39"} Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.723170 4786 scope.go:117] "RemoveContainer" containerID="a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.730582 4786 generic.go:334] "Generic (PLEG): container finished" podID="f49ffdce-d078-4718-95a4-df84a9d7abda" containerID="e434f0330ed6464eec1daaa0a8198d7bbaaef9e54400184169c3fb5f66c11391" exitCode=0 Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.730945 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-79d475cf97-tzr56" event={"ID":"f49ffdce-d078-4718-95a4-df84a9d7abda","Type":"ContainerDied","Data":"e434f0330ed6464eec1daaa0a8198d7bbaaef9e54400184169c3fb5f66c11391"} Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.759794 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-prqp8" podStartSLOduration=2.7597754070000002 podStartE2EDuration="2.759775407s" podCreationTimestamp="2026-03-13 12:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:39.755007107 +0000 UTC m=+1367.034660554" watchObservedRunningTime="2026-03-13 12:09:39.759775407 +0000 UTC m=+1367.039428854" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.767671 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bzltb" event={"ID":"f0a492f1-4ef8-4fc6-86a4-03e13932d63a","Type":"ContainerStarted","Data":"fc27704e4bdbce3b44659c515a380b307831cf8584a32d829b419b58f811d1da"} Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.873910 4786 scope.go:117] "RemoveContainer" containerID="a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8" Mar 13 12:09:39 crc kubenswrapper[4786]: E0313 12:09:39.875413 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8\": container with ID starting with a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8 not found: ID does not exist" containerID="a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.875445 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8"} err="failed to get container status 
\"a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8\": rpc error: code = NotFound desc = could not find container \"a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8\": container with ID starting with a0d1124c44f75987d781cef2d1bc966bd0cbf35ce5620aba9b5ce61e7f2b28d8 not found: ID does not exist" Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.931841 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789bb6d89c-l5c8s"] Mar 13 12:09:39 crc kubenswrapper[4786]: I0313 12:09:39.967707 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-789bb6d89c-l5c8s"] Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.092051 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.093467 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.097527 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8lmx7" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.097712 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.097852 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.114566 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.199832 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26fft\" (UniqueName: \"kubernetes.io/projected/4fbe3fc9-f366-4a6d-9807-3f791de69400-kube-api-access-26fft\") pod 
\"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.200257 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.200304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.200338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-logs\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.200393 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.200410 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.200430 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.211497 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.256134 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.264745 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.268455 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.297671 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd49cc99-szjf2"] Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.303280 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.303328 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-logs\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.303371 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.303388 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.303408 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.303456 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26fft\" (UniqueName: \"kubernetes.io/projected/4fbe3fc9-f366-4a6d-9807-3f791de69400-kube-api-access-26fft\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.303513 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.303816 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.317919 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.318290 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-logs\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.318505 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.322104 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.324510 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.325073 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.349915 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.363715 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26fft\" (UniqueName: \"kubernetes.io/projected/4fbe3fc9-f366-4a6d-9807-3f791de69400-kube-api-access-26fft\") pod \"glance-default-external-api-0\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.409569 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.409639 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nl9z\" (UniqueName: 
\"kubernetes.io/projected/e9af0e12-9ec4-43a8-9423-4c0acf818964-kube-api-access-9nl9z\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.409662 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.409781 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.409943 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.409984 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.410089 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: E0313 12:09:40.460464 4786 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 13 12:09:40 crc kubenswrapper[4786]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/f49ffdce-d078-4718-95a4-df84a9d7abda/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 12:09:40 crc kubenswrapper[4786]: > podSandboxID="f9447354095ecc1a35d53f50a79f337b3bd5147ba65d9c09e4d3115404efcd70" Mar 13 12:09:40 crc kubenswrapper[4786]: E0313 12:09:40.460671 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 12:09:40 crc kubenswrapper[4786]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66hbbh89h5c9h565h54bh5dch67fh7bh8fh76h5d7h689hdh5ddh59h568h56ch656h647h555hc6h579hcch68bh65h6dhbh677h68bh694h59dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cdwqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-79d475cf97-tzr56_openstack(f49ffdce-d078-4718-95a4-df84a9d7abda): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/f49ffdce-d078-4718-95a4-df84a9d7abda/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 12:09:40 crc kubenswrapper[4786]: > logger="UnhandledError" Mar 13 12:09:40 crc kubenswrapper[4786]: E0313 12:09:40.463042 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/f49ffdce-d078-4718-95a4-df84a9d7abda/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-79d475cf97-tzr56" podUID="f49ffdce-d078-4718-95a4-df84a9d7abda" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.512023 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " 
pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.512105 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.512128 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.512206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.512529 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.512589 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nl9z\" (UniqueName: \"kubernetes.io/projected/e9af0e12-9ec4-43a8-9423-4c0acf818964-kube-api-access-9nl9z\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc 
kubenswrapper[4786]: I0313 12:09:40.512610 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.513265 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.513286 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.513458 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.513919 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.524105 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.528783 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.533496 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nl9z\" (UniqueName: \"kubernetes.io/projected/e9af0e12-9ec4-43a8-9423-4c0acf818964-kube-api-access-9nl9z\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.536085 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.553727 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: 
I0313 12:09:40.585070 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.821936 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" event={"ID":"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b","Type":"ContainerStarted","Data":"73e9aa5fef6e5eca89e38f358224f2e80d0cb67879f44375d4c172979300e66a"} Mar 13 12:09:40 crc kubenswrapper[4786]: I0313 12:09:40.822259 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" event={"ID":"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b","Type":"ContainerStarted","Data":"3434b1e26cfefd8c4af479ef69443b0975fa36cb1747c9c56ffc31e4c54f847c"} Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.269669 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.336364 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.433918 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-svc\") pod \"f49ffdce-d078-4718-95a4-df84a9d7abda\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.433987 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdwqh\" (UniqueName: \"kubernetes.io/projected/f49ffdce-d078-4718-95a4-df84a9d7abda-kube-api-access-cdwqh\") pod \"f49ffdce-d078-4718-95a4-df84a9d7abda\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.434048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-sb\") pod \"f49ffdce-d078-4718-95a4-df84a9d7abda\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.434141 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-swift-storage-0\") pod \"f49ffdce-d078-4718-95a4-df84a9d7abda\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.434200 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-config\") pod \"f49ffdce-d078-4718-95a4-df84a9d7abda\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.434273 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-nb\") pod \"f49ffdce-d078-4718-95a4-df84a9d7abda\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.440502 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49ffdce-d078-4718-95a4-df84a9d7abda-kube-api-access-cdwqh" (OuterVolumeSpecName: "kube-api-access-cdwqh") pod "f49ffdce-d078-4718-95a4-df84a9d7abda" (UID: "f49ffdce-d078-4718-95a4-df84a9d7abda"). InnerVolumeSpecName "kube-api-access-cdwqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.469363 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce89ba18-706e-460f-97db-35a3d5008268" path="/var/lib/kubelet/pods/ce89ba18-706e-460f-97db-35a3d5008268/volumes" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.502707 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f49ffdce-d078-4718-95a4-df84a9d7abda" (UID: "f49ffdce-d078-4718-95a4-df84a9d7abda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.535993 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f49ffdce-d078-4718-95a4-df84a9d7abda" (UID: "f49ffdce-d078-4718-95a4-df84a9d7abda"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.536130 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-svc\") pod \"f49ffdce-d078-4718-95a4-df84a9d7abda\" (UID: \"f49ffdce-d078-4718-95a4-df84a9d7abda\") " Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.534809 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f49ffdce-d078-4718-95a4-df84a9d7abda" (UID: "f49ffdce-d078-4718-95a4-df84a9d7abda"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.538381 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdwqh\" (UniqueName: \"kubernetes.io/projected/f49ffdce-d078-4718-95a4-df84a9d7abda-kube-api-access-cdwqh\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.538399 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.538408 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:41 crc kubenswrapper[4786]: W0313 12:09:41.540051 4786 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f49ffdce-d078-4718-95a4-df84a9d7abda/volumes/kubernetes.io~configmap/dns-svc Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.540079 4786 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f49ffdce-d078-4718-95a4-df84a9d7abda" (UID: "f49ffdce-d078-4718-95a4-df84a9d7abda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.541155 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-config" (OuterVolumeSpecName: "config") pod "f49ffdce-d078-4718-95a4-df84a9d7abda" (UID: "f49ffdce-d078-4718-95a4-df84a9d7abda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.550385 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f49ffdce-d078-4718-95a4-df84a9d7abda" (UID: "f49ffdce-d078-4718-95a4-df84a9d7abda"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.597060 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.641147 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.641186 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.641201 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49ffdce-d078-4718-95a4-df84a9d7abda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.850520 4786 generic.go:334] "Generic (PLEG): container finished" podID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerID="73e9aa5fef6e5eca89e38f358224f2e80d0cb67879f44375d4c172979300e66a" exitCode=0 Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.850601 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" event={"ID":"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b","Type":"ContainerDied","Data":"73e9aa5fef6e5eca89e38f358224f2e80d0cb67879f44375d4c172979300e66a"} Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.855917 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbe3fc9-f366-4a6d-9807-3f791de69400","Type":"ContainerStarted","Data":"2887b42fb39c1285b2a22b76014f7d0afecb71e1021c362573bfa9999c3dd62d"} Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.860477 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"e9af0e12-9ec4-43a8-9423-4c0acf818964","Type":"ContainerStarted","Data":"297c20ddf1dcd52cc9afceb04e71123f4d77c11293f72666aa8cf6e492cd5580"} Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.863521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d475cf97-tzr56" event={"ID":"f49ffdce-d078-4718-95a4-df84a9d7abda","Type":"ContainerDied","Data":"f9447354095ecc1a35d53f50a79f337b3bd5147ba65d9c09e4d3115404efcd70"} Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.863563 4786 scope.go:117] "RemoveContainer" containerID="e434f0330ed6464eec1daaa0a8198d7bbaaef9e54400184169c3fb5f66c11391" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.863748 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d475cf97-tzr56" Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.958662 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79d475cf97-tzr56"] Mar 13 12:09:41 crc kubenswrapper[4786]: I0313 12:09:41.990503 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79d475cf97-tzr56"] Mar 13 12:09:42 crc kubenswrapper[4786]: I0313 12:09:42.883574 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" event={"ID":"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b","Type":"ContainerStarted","Data":"c5588970471f8b564c9ea9a71679e58b680a67f6302cf48e6bf143ac427f7437"} Mar 13 12:09:42 crc kubenswrapper[4786]: I0313 12:09:42.884247 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:42 crc kubenswrapper[4786]: I0313 12:09:42.890078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4fbe3fc9-f366-4a6d-9807-3f791de69400","Type":"ContainerStarted","Data":"090839b5307ceea77ba9e0fa3a0dd186e2d0ed3452aa2fb26ea4c031367e9cb8"} Mar 13 12:09:42 crc kubenswrapper[4786]: I0313 12:09:42.891819 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9af0e12-9ec4-43a8-9423-4c0acf818964","Type":"ContainerStarted","Data":"1478d91f7c847461679fb712e32b0e6e265007b23125e2532a73b00ca09929c0"} Mar 13 12:09:43 crc kubenswrapper[4786]: I0313 12:09:43.465662 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49ffdce-d078-4718-95a4-df84a9d7abda" path="/var/lib/kubelet/pods/f49ffdce-d078-4718-95a4-df84a9d7abda/volumes" Mar 13 12:09:43 crc kubenswrapper[4786]: I0313 12:09:43.480606 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" podStartSLOduration=4.480571945 podStartE2EDuration="4.480571945s" podCreationTimestamp="2026-03-13 12:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:42.905940404 +0000 UTC m=+1370.185593871" watchObservedRunningTime="2026-03-13 12:09:43.480571945 +0000 UTC m=+1370.760225392" Mar 13 12:09:43 crc kubenswrapper[4786]: I0313 12:09:43.910853 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbe3fc9-f366-4a6d-9807-3f791de69400","Type":"ContainerStarted","Data":"99fa8e25f59b68ca69e1b2b23fb13abc773e7a9e33d069273dd97cc2048bf2a3"} Mar 13 12:09:43 crc kubenswrapper[4786]: I0313 12:09:43.933870 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.933854085 podStartE2EDuration="3.933854085s" podCreationTimestamp="2026-03-13 12:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-13 12:09:43.932688713 +0000 UTC m=+1371.212342180" watchObservedRunningTime="2026-03-13 12:09:43.933854085 +0000 UTC m=+1371.213507532" Mar 13 12:09:44 crc kubenswrapper[4786]: I0313 12:09:44.922416 4786 generic.go:334] "Generic (PLEG): container finished" podID="f0a492f1-4ef8-4fc6-86a4-03e13932d63a" containerID="fc27704e4bdbce3b44659c515a380b307831cf8584a32d829b419b58f811d1da" exitCode=0 Mar 13 12:09:44 crc kubenswrapper[4786]: I0313 12:09:44.922501 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bzltb" event={"ID":"f0a492f1-4ef8-4fc6-86a4-03e13932d63a","Type":"ContainerDied","Data":"fc27704e4bdbce3b44659c515a380b307831cf8584a32d829b419b58f811d1da"} Mar 13 12:09:47 crc kubenswrapper[4786]: I0313 12:09:47.561924 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:09:47 crc kubenswrapper[4786]: I0313 12:09:47.562659 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerName="glance-log" containerID="cri-o://090839b5307ceea77ba9e0fa3a0dd186e2d0ed3452aa2fb26ea4c031367e9cb8" gracePeriod=30 Mar 13 12:09:47 crc kubenswrapper[4786]: I0313 12:09:47.563084 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerName="glance-httpd" containerID="cri-o://99fa8e25f59b68ca69e1b2b23fb13abc773e7a9e33d069273dd97cc2048bf2a3" gracePeriod=30 Mar 13 12:09:47 crc kubenswrapper[4786]: I0313 12:09:47.617688 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:09:47 crc kubenswrapper[4786]: I0313 12:09:47.989241 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fbe3fc9-f366-4a6d-9807-3f791de69400" 
containerID="99fa8e25f59b68ca69e1b2b23fb13abc773e7a9e33d069273dd97cc2048bf2a3" exitCode=0 Mar 13 12:09:47 crc kubenswrapper[4786]: I0313 12:09:47.989276 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerID="090839b5307ceea77ba9e0fa3a0dd186e2d0ed3452aa2fb26ea4c031367e9cb8" exitCode=143 Mar 13 12:09:47 crc kubenswrapper[4786]: I0313 12:09:47.989300 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbe3fc9-f366-4a6d-9807-3f791de69400","Type":"ContainerDied","Data":"99fa8e25f59b68ca69e1b2b23fb13abc773e7a9e33d069273dd97cc2048bf2a3"} Mar 13 12:09:47 crc kubenswrapper[4786]: I0313 12:09:47.989325 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbe3fc9-f366-4a6d-9807-3f791de69400","Type":"ContainerDied","Data":"090839b5307ceea77ba9e0fa3a0dd186e2d0ed3452aa2fb26ea4c031367e9cb8"} Mar 13 12:09:49 crc kubenswrapper[4786]: I0313 12:09:49.584116 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:09:49 crc kubenswrapper[4786]: I0313 12:09:49.694466 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8b998f77-rn5ds"] Mar 13 12:09:49 crc kubenswrapper[4786]: I0313 12:09:49.694740 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="dnsmasq-dns" containerID="cri-o://25fafa71eab8767caed9369e6ea23c868d0fcaca877ac8f3111397e078cc1a60" gracePeriod=10 Mar 13 12:09:49 crc kubenswrapper[4786]: I0313 12:09:49.973448 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection 
refused" Mar 13 12:09:50 crc kubenswrapper[4786]: I0313 12:09:50.007714 4786 generic.go:334] "Generic (PLEG): container finished" podID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerID="25fafa71eab8767caed9369e6ea23c868d0fcaca877ac8f3111397e078cc1a60" exitCode=0 Mar 13 12:09:50 crc kubenswrapper[4786]: I0313 12:09:50.007754 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" event={"ID":"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc","Type":"ContainerDied","Data":"25fafa71eab8767caed9369e6ea23c868d0fcaca877ac8f3111397e078cc1a60"} Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.867833 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.934592 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-combined-ca-bundle\") pod \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.934650 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-fernet-keys\") pod \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.934691 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-credential-keys\") pod \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.934793 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nppt\" 
(UniqueName: \"kubernetes.io/projected/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-kube-api-access-7nppt\") pod \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.934852 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-config-data\") pod \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.934894 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-scripts\") pod \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\" (UID: \"f0a492f1-4ef8-4fc6-86a4-03e13932d63a\") " Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.944409 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-kube-api-access-7nppt" (OuterVolumeSpecName: "kube-api-access-7nppt") pod "f0a492f1-4ef8-4fc6-86a4-03e13932d63a" (UID: "f0a492f1-4ef8-4fc6-86a4-03e13932d63a"). InnerVolumeSpecName "kube-api-access-7nppt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.944511 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f0a492f1-4ef8-4fc6-86a4-03e13932d63a" (UID: "f0a492f1-4ef8-4fc6-86a4-03e13932d63a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.947030 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f0a492f1-4ef8-4fc6-86a4-03e13932d63a" (UID: "f0a492f1-4ef8-4fc6-86a4-03e13932d63a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.947691 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-scripts" (OuterVolumeSpecName: "scripts") pod "f0a492f1-4ef8-4fc6-86a4-03e13932d63a" (UID: "f0a492f1-4ef8-4fc6-86a4-03e13932d63a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.972920 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0a492f1-4ef8-4fc6-86a4-03e13932d63a" (UID: "f0a492f1-4ef8-4fc6-86a4-03e13932d63a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:51 crc kubenswrapper[4786]: I0313 12:09:51.974790 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-config-data" (OuterVolumeSpecName: "config-data") pod "f0a492f1-4ef8-4fc6-86a4-03e13932d63a" (UID: "f0a492f1-4ef8-4fc6-86a4-03e13932d63a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 12:09:52.036579 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 12:09:52.036607 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 12:09:52.036616 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 12:09:52.036626 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 12:09:52.036636 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 12:09:52.036645 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nppt\" (UniqueName: \"kubernetes.io/projected/f0a492f1-4ef8-4fc6-86a4-03e13932d63a-kube-api-access-7nppt\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 12:09:52.042319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bzltb" event={"ID":"f0a492f1-4ef8-4fc6-86a4-03e13932d63a","Type":"ContainerDied","Data":"685e02175a791afca8074c1ecfc142942819db375469354325635b81b8371cce"} Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 
12:09:52.042368 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="685e02175a791afca8074c1ecfc142942819db375469354325635b81b8371cce" Mar 13 12:09:52 crc kubenswrapper[4786]: I0313 12:09:52.042427 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bzltb" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.040722 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bzltb"] Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.048463 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bzltb"] Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.051282 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9af0e12-9ec4-43a8-9423-4c0acf818964","Type":"ContainerStarted","Data":"80e9f0c7f318f08d662550102297ea1b9370f0bb2051853a9a6f0f6d1a77e7ff"} Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.051377 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerName="glance-log" containerID="cri-o://1478d91f7c847461679fb712e32b0e6e265007b23125e2532a73b00ca09929c0" gracePeriod=30 Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.051479 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerName="glance-httpd" containerID="cri-o://80e9f0c7f318f08d662550102297ea1b9370f0bb2051853a9a6f0f6d1a77e7ff" gracePeriod=30 Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.087931 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.087912968 podStartE2EDuration="13.087912968s" podCreationTimestamp="2026-03-13 
12:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:53.072006776 +0000 UTC m=+1380.351660243" watchObservedRunningTime="2026-03-13 12:09:53.087912968 +0000 UTC m=+1380.367566415" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.150755 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gtn8s"] Mar 13 12:09:53 crc kubenswrapper[4786]: E0313 12:09:53.151094 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a492f1-4ef8-4fc6-86a4-03e13932d63a" containerName="keystone-bootstrap" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.151107 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a492f1-4ef8-4fc6-86a4-03e13932d63a" containerName="keystone-bootstrap" Mar 13 12:09:53 crc kubenswrapper[4786]: E0313 12:09:53.151132 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49ffdce-d078-4718-95a4-df84a9d7abda" containerName="init" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.151138 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49ffdce-d078-4718-95a4-df84a9d7abda" containerName="init" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.151307 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a492f1-4ef8-4fc6-86a4-03e13932d63a" containerName="keystone-bootstrap" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.151331 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49ffdce-d078-4718-95a4-df84a9d7abda" containerName="init" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.151956 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.154529 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-66qfn" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.154569 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.154529 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.154780 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.164960 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.171808 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gtn8s"] Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.255435 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-combined-ca-bundle\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.255491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-fernet-keys\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.255623 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-config-data\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.255717 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdcz\" (UniqueName: \"kubernetes.io/projected/40623686-b681-4c6b-aa73-5b5ac94e4a4c-kube-api-access-zfdcz\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.255807 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-scripts\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.255833 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-credential-keys\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.357686 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-scripts\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.357999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-credential-keys\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.358221 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-combined-ca-bundle\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.359068 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-fernet-keys\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.359204 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-config-data\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.359315 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdcz\" (UniqueName: \"kubernetes.io/projected/40623686-b681-4c6b-aa73-5b5ac94e4a4c-kube-api-access-zfdcz\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.363794 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-combined-ca-bundle\") pod 
\"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.364207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-scripts\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.364753 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-credential-keys\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.365097 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-config-data\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.368792 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-fernet-keys\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.377414 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdcz\" (UniqueName: \"kubernetes.io/projected/40623686-b681-4c6b-aa73-5b5ac94e4a4c-kube-api-access-zfdcz\") pod \"keystone-bootstrap-gtn8s\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") " pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:53 crc 
kubenswrapper[4786]: I0313 12:09:53.457034 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0a492f1-4ef8-4fc6-86a4-03e13932d63a" path="/var/lib/kubelet/pods/f0a492f1-4ef8-4fc6-86a4-03e13932d63a/volumes" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.481552 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-66qfn" Mar 13 12:09:53 crc kubenswrapper[4786]: I0313 12:09:53.489317 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:09:54 crc kubenswrapper[4786]: I0313 12:09:54.060263 4786 generic.go:334] "Generic (PLEG): container finished" podID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerID="80e9f0c7f318f08d662550102297ea1b9370f0bb2051853a9a6f0f6d1a77e7ff" exitCode=0 Mar 13 12:09:54 crc kubenswrapper[4786]: I0313 12:09:54.061132 4786 generic.go:334] "Generic (PLEG): container finished" podID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerID="1478d91f7c847461679fb712e32b0e6e265007b23125e2532a73b00ca09929c0" exitCode=143 Mar 13 12:09:54 crc kubenswrapper[4786]: I0313 12:09:54.061140 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9af0e12-9ec4-43a8-9423-4c0acf818964","Type":"ContainerDied","Data":"80e9f0c7f318f08d662550102297ea1b9370f0bb2051853a9a6f0f6d1a77e7ff"} Mar 13 12:09:54 crc kubenswrapper[4786]: I0313 12:09:54.061295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9af0e12-9ec4-43a8-9423-4c0acf818964","Type":"ContainerDied","Data":"1478d91f7c847461679fb712e32b0e6e265007b23125e2532a73b00ca09929c0"} Mar 13 12:09:54 crc kubenswrapper[4786]: I0313 12:09:54.968586 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="dnsmasq-dns" probeResult="failure" 
output="dial tcp 10.217.0.139:5353: connect: connection refused" Mar 13 12:09:59 crc kubenswrapper[4786]: I0313 12:09:59.968807 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Mar 13 12:09:59 crc kubenswrapper[4786]: I0313 12:09:59.969673 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.144482 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556730-n78jr"] Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.146124 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-n78jr" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.150113 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.150303 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.150548 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.154421 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-n78jr"] Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.180162 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwgbs\" (UniqueName: \"kubernetes.io/projected/855c715a-2a47-4dc6-ac8c-d5443ab2f0f9-kube-api-access-lwgbs\") pod \"auto-csr-approver-29556730-n78jr\" (UID: 
\"855c715a-2a47-4dc6-ac8c-d5443ab2f0f9\") " pod="openshift-infra/auto-csr-approver-29556730-n78jr" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.281782 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwgbs\" (UniqueName: \"kubernetes.io/projected/855c715a-2a47-4dc6-ac8c-d5443ab2f0f9-kube-api-access-lwgbs\") pod \"auto-csr-approver-29556730-n78jr\" (UID: \"855c715a-2a47-4dc6-ac8c-d5443ab2f0f9\") " pod="openshift-infra/auto-csr-approver-29556730-n78jr" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.303475 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwgbs\" (UniqueName: \"kubernetes.io/projected/855c715a-2a47-4dc6-ac8c-d5443ab2f0f9-kube-api-access-lwgbs\") pod \"auto-csr-approver-29556730-n78jr\" (UID: \"855c715a-2a47-4dc6-ac8c-d5443ab2f0f9\") " pod="openshift-infra/auto-csr-approver-29556730-n78jr" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.362779 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.383578 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-logs\") pod \"4fbe3fc9-f366-4a6d-9807-3f791de69400\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.383637 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26fft\" (UniqueName: \"kubernetes.io/projected/4fbe3fc9-f366-4a6d-9807-3f791de69400-kube-api-access-26fft\") pod \"4fbe3fc9-f366-4a6d-9807-3f791de69400\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.383704 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-scripts\") pod \"4fbe3fc9-f366-4a6d-9807-3f791de69400\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.383748 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-httpd-run\") pod \"4fbe3fc9-f366-4a6d-9807-3f791de69400\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.383861 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-config-data\") pod \"4fbe3fc9-f366-4a6d-9807-3f791de69400\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.383900 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"4fbe3fc9-f366-4a6d-9807-3f791de69400\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.384050 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-combined-ca-bundle\") pod \"4fbe3fc9-f366-4a6d-9807-3f791de69400\" (UID: \"4fbe3fc9-f366-4a6d-9807-3f791de69400\") " Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.384535 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-logs" (OuterVolumeSpecName: "logs") pod "4fbe3fc9-f366-4a6d-9807-3f791de69400" (UID: "4fbe3fc9-f366-4a6d-9807-3f791de69400"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.392254 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-scripts" (OuterVolumeSpecName: "scripts") pod "4fbe3fc9-f366-4a6d-9807-3f791de69400" (UID: "4fbe3fc9-f366-4a6d-9807-3f791de69400"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.392539 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "4fbe3fc9-f366-4a6d-9807-3f791de69400" (UID: "4fbe3fc9-f366-4a6d-9807-3f791de69400"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.392848 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4fbe3fc9-f366-4a6d-9807-3f791de69400" (UID: "4fbe3fc9-f366-4a6d-9807-3f791de69400"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.394535 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbe3fc9-f366-4a6d-9807-3f791de69400-kube-api-access-26fft" (OuterVolumeSpecName: "kube-api-access-26fft") pod "4fbe3fc9-f366-4a6d-9807-3f791de69400" (UID: "4fbe3fc9-f366-4a6d-9807-3f791de69400"). InnerVolumeSpecName "kube-api-access-26fft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.438537 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fbe3fc9-f366-4a6d-9807-3f791de69400" (UID: "4fbe3fc9-f366-4a6d-9807-3f791de69400"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.472003 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-config-data" (OuterVolumeSpecName: "config-data") pod "4fbe3fc9-f366-4a6d-9807-3f791de69400" (UID: "4fbe3fc9-f366-4a6d-9807-3f791de69400"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.485203 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-n78jr" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.486763 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.486795 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.486806 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.486816 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.486827 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26fft\" (UniqueName: \"kubernetes.io/projected/4fbe3fc9-f366-4a6d-9807-3f791de69400-kube-api-access-26fft\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.486835 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbe3fc9-f366-4a6d-9807-3f791de69400-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.486844 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbe3fc9-f366-4a6d-9807-3f791de69400-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.506019 4786 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 13 12:10:00 crc kubenswrapper[4786]: I0313 12:10:00.589110 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.128199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbe3fc9-f366-4a6d-9807-3f791de69400","Type":"ContainerDied","Data":"2887b42fb39c1285b2a22b76014f7d0afecb71e1021c362573bfa9999c3dd62d"} Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.128239 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.128253 4786 scope.go:117] "RemoveContainer" containerID="99fa8e25f59b68ca69e1b2b23fb13abc773e7a9e33d069273dd97cc2048bf2a3" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.167877 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.181043 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.211748 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:10:01 crc kubenswrapper[4786]: E0313 12:10:01.212181 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerName="glance-log" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.212204 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerName="glance-log" Mar 13 12:10:01 crc kubenswrapper[4786]: E0313 
12:10:01.212240 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerName="glance-httpd" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.212246 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerName="glance-httpd" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.212389 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerName="glance-log" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.212415 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" containerName="glance-httpd" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.213358 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.221415 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.230437 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.230653 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.303015 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-logs\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.303301 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4cbt5\" (UniqueName: \"kubernetes.io/projected/e0be8ed6-db24-42dd-8e7d-406ce46d2787-kube-api-access-4cbt5\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.303351 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.303376 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.303432 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.303472 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.303492 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.303509 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.405542 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.405604 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.405632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.405650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.405677 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-logs\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.405700 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cbt5\" (UniqueName: \"kubernetes.io/projected/e0be8ed6-db24-42dd-8e7d-406ce46d2787-kube-api-access-4cbt5\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.405744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.405769 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.406494 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.406773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.409136 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-logs\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.412184 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.412743 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.417528 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " 
pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.417826 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.419896 4786 scope.go:117] "RemoveContainer" containerID="090839b5307ceea77ba9e0fa3a0dd186e2d0ed3452aa2fb26ea4c031367e9cb8" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.433817 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cbt5\" (UniqueName: \"kubernetes.io/projected/e0be8ed6-db24-42dd-8e7d-406ce46d2787-kube-api-access-4cbt5\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: E0313 12:10:01.453451 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 13 12:10:01 crc kubenswrapper[4786]: E0313 12:10:01.454558 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghrg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-59gr5_openstack(6d0fe660-4646-4b25-b5b6-b24989d78be4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:10:01 crc kubenswrapper[4786]: E0313 12:10:01.456649 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-59gr5" podUID="6d0fe660-4646-4b25-b5b6-b24989d78be4" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.491454 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fbe3fc9-f366-4a6d-9807-3f791de69400" path="/var/lib/kubelet/pods/4fbe3fc9-f366-4a6d-9807-3f791de69400/volumes" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.494047 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.578871 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.707792 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.738715 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.814626 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-config-data\") pod \"e9af0e12-9ec4-43a8-9423-4c0acf818964\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816361 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nl9z\" (UniqueName: \"kubernetes.io/projected/e9af0e12-9ec4-43a8-9423-4c0acf818964-kube-api-access-9nl9z\") pod \"e9af0e12-9ec4-43a8-9423-4c0acf818964\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816403 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-swift-storage-0\") pod \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816432 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-logs\") pod \"e9af0e12-9ec4-43a8-9423-4c0acf818964\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816451 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-config\") pod \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816534 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-combined-ca-bundle\") pod \"e9af0e12-9ec4-43a8-9423-4c0acf818964\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816575 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-scripts\") pod \"e9af0e12-9ec4-43a8-9423-4c0acf818964\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816606 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-svc\") pod \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816624 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jntmc\" (UniqueName: \"kubernetes.io/projected/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-kube-api-access-jntmc\") pod \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816670 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-sb\") pod \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-nb\") pod \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\" (UID: \"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 
12:10:01.816708 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"e9af0e12-9ec4-43a8-9423-4c0acf818964\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.816731 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-httpd-run\") pod \"e9af0e12-9ec4-43a8-9423-4c0acf818964\" (UID: \"e9af0e12-9ec4-43a8-9423-4c0acf818964\") " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.818105 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e9af0e12-9ec4-43a8-9423-4c0acf818964" (UID: "e9af0e12-9ec4-43a8-9423-4c0acf818964"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.824345 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9af0e12-9ec4-43a8-9423-4c0acf818964-kube-api-access-9nl9z" (OuterVolumeSpecName: "kube-api-access-9nl9z") pod "e9af0e12-9ec4-43a8-9423-4c0acf818964" (UID: "e9af0e12-9ec4-43a8-9423-4c0acf818964"). InnerVolumeSpecName "kube-api-access-9nl9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.825035 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-scripts" (OuterVolumeSpecName: "scripts") pod "e9af0e12-9ec4-43a8-9423-4c0acf818964" (UID: "e9af0e12-9ec4-43a8-9423-4c0acf818964"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.825657 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-logs" (OuterVolumeSpecName: "logs") pod "e9af0e12-9ec4-43a8-9423-4c0acf818964" (UID: "e9af0e12-9ec4-43a8-9423-4c0acf818964"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.829685 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-kube-api-access-jntmc" (OuterVolumeSpecName: "kube-api-access-jntmc") pod "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" (UID: "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc"). InnerVolumeSpecName "kube-api-access-jntmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.852411 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "e9af0e12-9ec4-43a8-9423-4c0acf818964" (UID: "e9af0e12-9ec4-43a8-9423-4c0acf818964"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.917674 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gtn8s"] Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.918950 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.918975 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jntmc\" (UniqueName: \"kubernetes.io/projected/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-kube-api-access-jntmc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.919003 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.919012 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.919020 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nl9z\" (UniqueName: \"kubernetes.io/projected/e9af0e12-9ec4-43a8-9423-4c0acf818964-kube-api-access-9nl9z\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.919029 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9af0e12-9ec4-43a8-9423-4c0acf818964-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:01 crc kubenswrapper[4786]: W0313 12:10:01.930635 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40623686_b681_4c6b_aa73_5b5ac94e4a4c.slice/crio-6c9097f6c7fe4749a664032c040f5d338c216cace501c3a8a3e7cbb20f96909d WatchSource:0}: Error finding container 6c9097f6c7fe4749a664032c040f5d338c216cace501c3a8a3e7cbb20f96909d: Status 404 returned error can't find the container with id 6c9097f6c7fe4749a664032c040f5d338c216cace501c3a8a3e7cbb20f96909d Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.936752 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.984132 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" (UID: "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.992343 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 13 12:10:01 crc kubenswrapper[4786]: I0313 12:10:01.998765 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9af0e12-9ec4-43a8-9423-4c0acf818964" (UID: "e9af0e12-9ec4-43a8-9423-4c0acf818964"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.002307 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-config" (OuterVolumeSpecName: "config") pod "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" (UID: "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.006010 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-config-data" (OuterVolumeSpecName: "config-data") pod "e9af0e12-9ec4-43a8-9423-4c0acf818964" (UID: "e9af0e12-9ec4-43a8-9423-4c0acf818964"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.008272 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" (UID: "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.008777 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" (UID: "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.012283 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" (UID: "b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.020389 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.020416 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.020426 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.020434 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.020443 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.020451 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e9af0e12-9ec4-43a8-9423-4c0acf818964-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.020459 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.020468 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.037281 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-n78jr"] Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.136761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gtn8s" event={"ID":"40623686-b681-4c6b-aa73-5b5ac94e4a4c","Type":"ContainerStarted","Data":"0d7e4010821ab2b1ddac83a1076c1a1e388750a9cd4820f6725889bda766f5ea"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.136817 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gtn8s" event={"ID":"40623686-b681-4c6b-aa73-5b5ac94e4a4c","Type":"ContainerStarted","Data":"6c9097f6c7fe4749a664032c040f5d338c216cace501c3a8a3e7cbb20f96909d"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.141287 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.141401 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8b998f77-rn5ds" event={"ID":"b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc","Type":"ContainerDied","Data":"1085305dc9c9253464712745558eccf5cc77974eb80442e31884291fd12f5a29"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.141436 4786 scope.go:117] "RemoveContainer" containerID="25fafa71eab8767caed9369e6ea23c868d0fcaca877ac8f3111397e078cc1a60" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.143500 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5rb8r" event={"ID":"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb","Type":"ContainerStarted","Data":"766a2819d96407236e8a3bd4f525acafc200a2bab1d1ad0bf70c72c3c07ecc3c"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.149536 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9af0e12-9ec4-43a8-9423-4c0acf818964","Type":"ContainerDied","Data":"297c20ddf1dcd52cc9afceb04e71123f4d77c11293f72666aa8cf6e492cd5580"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.149547 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.159033 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x5ns6" event={"ID":"d69f3ce2-7166-46f3-8381-987837e3383e","Type":"ContainerStarted","Data":"2f45e2aaf79647e376432ad16e19542a71767496aecdffdb4d6cc63e555f3298"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.160001 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gtn8s" podStartSLOduration=9.159977225 podStartE2EDuration="9.159977225s" podCreationTimestamp="2026-03-13 12:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:02.153727516 +0000 UTC m=+1389.433380973" watchObservedRunningTime="2026-03-13 12:10:02.159977225 +0000 UTC m=+1389.439630672" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.164066 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerStarted","Data":"187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.167214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-n78jr" event={"ID":"855c715a-2a47-4dc6-ac8c-d5443ab2f0f9","Type":"ContainerStarted","Data":"e4f8d8ebccc315c5dedcf7b6f63feecfaa203edde215b80b510b4bb2dd646bec"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.170366 4786 generic.go:334] "Generic (PLEG): container finished" podID="07961f7a-7824-4e7d-b30a-e47699b2ca0f" containerID="ae16e2216939862263bfe245efece6c23823d38bfbc785950f78f2415d0c22ac" exitCode=0 Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.170424 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-prqp8" 
event={"ID":"07961f7a-7824-4e7d-b30a-e47699b2ca0f","Type":"ContainerDied","Data":"ae16e2216939862263bfe245efece6c23823d38bfbc785950f78f2415d0c22ac"} Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.178775 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5rb8r" podStartSLOduration=2.258105175 podStartE2EDuration="25.178755185s" podCreationTimestamp="2026-03-13 12:09:37 +0000 UTC" firstStartedPulling="2026-03-13 12:09:38.507853728 +0000 UTC m=+1365.787507175" lastFinishedPulling="2026-03-13 12:10:01.428503738 +0000 UTC m=+1388.708157185" observedRunningTime="2026-03-13 12:10:02.175332742 +0000 UTC m=+1389.454986199" watchObservedRunningTime="2026-03-13 12:10:02.178755185 +0000 UTC m=+1389.458408642" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.188800 4786 scope.go:117] "RemoveContainer" containerID="d39ea3a1da6f99cbb62fe791878f5ef9cce1d109815f5a80560e00ab50025aca" Mar 13 12:10:02 crc kubenswrapper[4786]: E0313 12:10:02.189089 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-59gr5" podUID="6d0fe660-4646-4b25-b5b6-b24989d78be4" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.228209 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8b998f77-rn5ds"] Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.235164 4786 scope.go:117] "RemoveContainer" containerID="80e9f0c7f318f08d662550102297ea1b9370f0bb2051853a9a6f0f6d1a77e7ff" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.238783 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c8b998f77-rn5ds"] Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.248950 4786 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.265704 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-x5ns6" podStartSLOduration=2.203341888 podStartE2EDuration="25.265684284s" podCreationTimestamp="2026-03-13 12:09:37 +0000 UTC" firstStartedPulling="2026-03-13 12:09:38.364871927 +0000 UTC m=+1365.644525374" lastFinishedPulling="2026-03-13 12:10:01.427214323 +0000 UTC m=+1388.706867770" observedRunningTime="2026-03-13 12:10:02.260979765 +0000 UTC m=+1389.540633232" watchObservedRunningTime="2026-03-13 12:10:02.265684284 +0000 UTC m=+1389.545337731" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.286195 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.295013 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.306137 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:02 crc kubenswrapper[4786]: E0313 12:10:02.312612 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="dnsmasq-dns" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.312643 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="dnsmasq-dns" Mar 13 12:10:02 crc kubenswrapper[4786]: E0313 12:10:02.312652 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="init" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.312659 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="init" Mar 13 12:10:02 crc kubenswrapper[4786]: E0313 
12:10:02.312672 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerName="glance-log" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.312680 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerName="glance-log" Mar 13 12:10:02 crc kubenswrapper[4786]: E0313 12:10:02.312706 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerName="glance-httpd" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.312713 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerName="glance-httpd" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.312854 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerName="glance-httpd" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.312869 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" containerName="dnsmasq-dns" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.312894 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" containerName="glance-log" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.313705 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.318179 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.318381 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.322770 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.361733 4786 scope.go:117] "RemoveContainer" containerID="1478d91f7c847461679fb712e32b0e6e265007b23125e2532a73b00ca09929c0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.427950 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.427999 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.428059 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc 
kubenswrapper[4786]: I0313 12:10:02.428255 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.428346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.428474 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9fj\" (UniqueName: \"kubernetes.io/projected/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-kube-api-access-hc9fj\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.428570 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.428670 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-logs\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 
13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.529769 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9fj\" (UniqueName: \"kubernetes.io/projected/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-kube-api-access-hc9fj\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.529895 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.529938 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-logs\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.530006 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.530032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.530099 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.530132 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.530164 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.530577 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.530776 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-logs\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.530896 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.534578 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.538531 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.542613 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.542802 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.546055 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9fj\" (UniqueName: 
\"kubernetes.io/projected/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-kube-api-access-hc9fj\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.566624 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:02 crc kubenswrapper[4786]: I0313 12:10:02.667638 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.184987 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0be8ed6-db24-42dd-8e7d-406ce46d2787","Type":"ContainerStarted","Data":"4386657e12cfec63465a3b15404c56ebc92434156273aa41cc0e3fb89d3392fd"} Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.185344 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0be8ed6-db24-42dd-8e7d-406ce46d2787","Type":"ContainerStarted","Data":"75f9028b0da1b0c4965f25053bfeefd13e903ecdaf4334db6c4cbfc098a2559c"} Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.214011 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.455205 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc" path="/var/lib/kubelet/pods/b4188daf-2fcb-49ae-b0af-ac1e6b6b6fbc/volumes" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.455938 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9af0e12-9ec4-43a8-9423-4c0acf818964" 
path="/var/lib/kubelet/pods/e9af0e12-9ec4-43a8-9423-4c0acf818964/volumes" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.621015 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-prqp8" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.650304 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8qp5\" (UniqueName: \"kubernetes.io/projected/07961f7a-7824-4e7d-b30a-e47699b2ca0f-kube-api-access-r8qp5\") pod \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.650491 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-combined-ca-bundle\") pod \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.650534 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-config\") pod \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\" (UID: \"07961f7a-7824-4e7d-b30a-e47699b2ca0f\") " Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.663810 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07961f7a-7824-4e7d-b30a-e47699b2ca0f-kube-api-access-r8qp5" (OuterVolumeSpecName: "kube-api-access-r8qp5") pod "07961f7a-7824-4e7d-b30a-e47699b2ca0f" (UID: "07961f7a-7824-4e7d-b30a-e47699b2ca0f"). InnerVolumeSpecName "kube-api-access-r8qp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.684120 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07961f7a-7824-4e7d-b30a-e47699b2ca0f" (UID: "07961f7a-7824-4e7d-b30a-e47699b2ca0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.691007 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-config" (OuterVolumeSpecName: "config") pod "07961f7a-7824-4e7d-b30a-e47699b2ca0f" (UID: "07961f7a-7824-4e7d-b30a-e47699b2ca0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.752855 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.752922 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/07961f7a-7824-4e7d-b30a-e47699b2ca0f-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:03 crc kubenswrapper[4786]: I0313 12:10:03.752934 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8qp5\" (UniqueName: \"kubernetes.io/projected/07961f7a-7824-4e7d-b30a-e47699b2ca0f-kube-api-access-r8qp5\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.211403 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e0be8ed6-db24-42dd-8e7d-406ce46d2787","Type":"ContainerStarted","Data":"242cb13505de80d789eccfaca107cdef2f91c7dca24ac7ce426c66c1b256a47f"} Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.228714 4786 generic.go:334] "Generic (PLEG): container finished" podID="855c715a-2a47-4dc6-ac8c-d5443ab2f0f9" containerID="3ffd782cc85ee75660750d9e43f93d793c70c06599115717af65c7e77938cc2e" exitCode=0 Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.228802 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-n78jr" event={"ID":"855c715a-2a47-4dc6-ac8c-d5443ab2f0f9","Type":"ContainerDied","Data":"3ffd782cc85ee75660750d9e43f93d793c70c06599115717af65c7e77938cc2e"} Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.258806 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-prqp8" event={"ID":"07961f7a-7824-4e7d-b30a-e47699b2ca0f","Type":"ContainerDied","Data":"e21765bcd30df602e32b19fc2333d47e765300a461805329a8f59e62307e6b89"} Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.258857 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e21765bcd30df602e32b19fc2333d47e765300a461805329a8f59e62307e6b89" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.259340 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-prqp8" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.280412 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.28039175 podStartE2EDuration="3.28039175s" podCreationTimestamp="2026-03-13 12:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:04.245679268 +0000 UTC m=+1391.525332725" watchObservedRunningTime="2026-03-13 12:10:04.28039175 +0000 UTC m=+1391.560045197" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.290479 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b","Type":"ContainerStarted","Data":"7400bda19ed93a6aaa9008fa088ac58b20cc2ac4be45e3e0d28b2b9028eb3230"} Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.290534 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b","Type":"ContainerStarted","Data":"3fb8306c9237486f944233ffd6ac1e88267bb27215b5bbb7a6908c10ec2292c8"} Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.414025 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r"] Mar 13 12:10:04 crc kubenswrapper[4786]: E0313 12:10:04.414426 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07961f7a-7824-4e7d-b30a-e47699b2ca0f" containerName="neutron-db-sync" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.414447 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="07961f7a-7824-4e7d-b30a-e47699b2ca0f" containerName="neutron-db-sync" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.414767 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="07961f7a-7824-4e7d-b30a-e47699b2ca0f" 
containerName="neutron-db-sync" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.416632 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.512114 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r"] Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.545150 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6695497cb-f75lw"] Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.556365 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.569385 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.569596 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mnmqw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.569702 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.569388 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.570536 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-swift-storage-0\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.570568 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-svc\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.570619 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgwm\" (UniqueName: \"kubernetes.io/projected/dda617df-8768-489a-8bbd-5876eb587961-kube-api-access-hmgwm\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.570675 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.570717 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-config\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.570750 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-sb\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.573113 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6695497cb-f75lw"] Mar 
13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672526 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgwm\" (UniqueName: \"kubernetes.io/projected/dda617df-8768-489a-8bbd-5876eb587961-kube-api-access-hmgwm\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672599 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-config\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672630 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-combined-ca-bundle\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672653 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672692 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-config\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 
12:10:04.672727 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-ovndb-tls-certs\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672749 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-httpd-config\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672766 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-sb\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672811 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsfpv\" (UniqueName: \"kubernetes.io/projected/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-kube-api-access-nsfpv\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.672844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-swift-storage-0\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 
12:10:04.672867 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-svc\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.673725 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-svc\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.674253 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.675625 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-sb\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.675754 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-swift-storage-0\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.676070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-config\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.695606 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgwm\" (UniqueName: \"kubernetes.io/projected/dda617df-8768-489a-8bbd-5876eb587961-kube-api-access-hmgwm\") pod \"dnsmasq-dns-6d8b7f7f4c-8jb5r\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.768958 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.774127 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsfpv\" (UniqueName: \"kubernetes.io/projected/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-kube-api-access-nsfpv\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.774225 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-config\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.774248 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-combined-ca-bundle\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.774298 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-ovndb-tls-certs\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.774315 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-httpd-config\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.777989 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-config\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.778442 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-ovndb-tls-certs\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.778923 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-combined-ca-bundle\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.792796 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsfpv\" (UniqueName: 
\"kubernetes.io/projected/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-kube-api-access-nsfpv\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.793753 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-httpd-config\") pod \"neutron-6695497cb-f75lw\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:04 crc kubenswrapper[4786]: I0313 12:10:04.892521 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.255765 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r"] Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.338137 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerStarted","Data":"ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836"} Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.369110 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b","Type":"ContainerStarted","Data":"a0309324151576503eed901fa51e562b264bd62cc01f6bb7639186ea905eebf9"} Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.379147 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" event={"ID":"dda617df-8768-489a-8bbd-5876eb587961","Type":"ContainerStarted","Data":"8dade165fc54fed3ef517ddce06bb0b42af163a1ba311ed0799d399ddae7020e"} Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.383015 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="d69f3ce2-7166-46f3-8381-987837e3383e" containerID="2f45e2aaf79647e376432ad16e19542a71767496aecdffdb4d6cc63e555f3298" exitCode=0 Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.383179 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x5ns6" event={"ID":"d69f3ce2-7166-46f3-8381-987837e3383e","Type":"ContainerDied","Data":"2f45e2aaf79647e376432ad16e19542a71767496aecdffdb4d6cc63e555f3298"} Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.412048 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.412023595 podStartE2EDuration="3.412023595s" podCreationTimestamp="2026-03-13 12:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:05.397075799 +0000 UTC m=+1392.676729246" watchObservedRunningTime="2026-03-13 12:10:05.412023595 +0000 UTC m=+1392.691677052" Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.501834 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6695497cb-f75lw"] Mar 13 12:10:05 crc kubenswrapper[4786]: W0313 12:10:05.510101 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9ff8d2_f59a_430a_93dd_ad6df3ad0a8e.slice/crio-2202dbfe91b91a147eca2995f53ab20ddf28c423277a61fcbcdcdccdd586bc99 WatchSource:0}: Error finding container 2202dbfe91b91a147eca2995f53ab20ddf28c423277a61fcbcdcdccdd586bc99: Status 404 returned error can't find the container with id 2202dbfe91b91a147eca2995f53ab20ddf28c423277a61fcbcdcdccdd586bc99 Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.763144 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-n78jr" Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.916088 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwgbs\" (UniqueName: \"kubernetes.io/projected/855c715a-2a47-4dc6-ac8c-d5443ab2f0f9-kube-api-access-lwgbs\") pod \"855c715a-2a47-4dc6-ac8c-d5443ab2f0f9\" (UID: \"855c715a-2a47-4dc6-ac8c-d5443ab2f0f9\") " Mar 13 12:10:05 crc kubenswrapper[4786]: I0313 12:10:05.921072 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855c715a-2a47-4dc6-ac8c-d5443ab2f0f9-kube-api-access-lwgbs" (OuterVolumeSpecName: "kube-api-access-lwgbs") pod "855c715a-2a47-4dc6-ac8c-d5443ab2f0f9" (UID: "855c715a-2a47-4dc6-ac8c-d5443ab2f0f9"). InnerVolumeSpecName "kube-api-access-lwgbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.020401 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwgbs\" (UniqueName: \"kubernetes.io/projected/855c715a-2a47-4dc6-ac8c-d5443ab2f0f9-kube-api-access-lwgbs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.399022 4786 generic.go:334] "Generic (PLEG): container finished" podID="108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb" containerID="766a2819d96407236e8a3bd4f525acafc200a2bab1d1ad0bf70c72c3c07ecc3c" exitCode=0 Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.399141 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5rb8r" event={"ID":"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb","Type":"ContainerDied","Data":"766a2819d96407236e8a3bd4f525acafc200a2bab1d1ad0bf70c72c3c07ecc3c"} Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.404014 4786 generic.go:334] "Generic (PLEG): container finished" podID="dda617df-8768-489a-8bbd-5876eb587961" containerID="b4a02fc023117f09e3d3e5a7efa55580e7dde6907de4ab87c55e391577ab23a0" 
exitCode=0
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.404102 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" event={"ID":"dda617df-8768-489a-8bbd-5876eb587961","Type":"ContainerDied","Data":"b4a02fc023117f09e3d3e5a7efa55580e7dde6907de4ab87c55e391577ab23a0"}
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.410243 4786 generic.go:334] "Generic (PLEG): container finished" podID="40623686-b681-4c6b-aa73-5b5ac94e4a4c" containerID="0d7e4010821ab2b1ddac83a1076c1a1e388750a9cd4820f6725889bda766f5ea" exitCode=0
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.410347 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gtn8s" event={"ID":"40623686-b681-4c6b-aa73-5b5ac94e4a4c","Type":"ContainerDied","Data":"0d7e4010821ab2b1ddac83a1076c1a1e388750a9cd4820f6725889bda766f5ea"}
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.426901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6695497cb-f75lw" event={"ID":"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e","Type":"ContainerStarted","Data":"fd9f41b8b3315b8b8e67a93d2da46fd1cc3e421939704c02fca39b112c8be3ad"}
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.426989 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6695497cb-f75lw"
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.427005 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6695497cb-f75lw" event={"ID":"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e","Type":"ContainerStarted","Data":"ba1170c5d3d38880fad622163eb1a2c2fe21ed2114dd24e5037faabbbe295718"}
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.427017 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6695497cb-f75lw" event={"ID":"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e","Type":"ContainerStarted","Data":"2202dbfe91b91a147eca2995f53ab20ddf28c423277a61fcbcdcdccdd586bc99"}
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.429106 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-n78jr"
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.429167 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-n78jr" event={"ID":"855c715a-2a47-4dc6-ac8c-d5443ab2f0f9","Type":"ContainerDied","Data":"e4f8d8ebccc315c5dedcf7b6f63feecfaa203edde215b80b510b4bb2dd646bec"}
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.429186 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4f8d8ebccc315c5dedcf7b6f63feecfaa203edde215b80b510b4bb2dd646bec"
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.524344 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6695497cb-f75lw" podStartSLOduration=2.524320665 podStartE2EDuration="2.524320665s" podCreationTimestamp="2026-03-13 12:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:06.521272123 +0000 UTC m=+1393.800925590" watchObservedRunningTime="2026-03-13 12:10:06.524320665 +0000 UTC m=+1393.803974122"
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.732841 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x5ns6"
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.816843 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-lq5fw"]
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.822534 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-lq5fw"]
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.861085 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxnx6\" (UniqueName: \"kubernetes.io/projected/d69f3ce2-7166-46f3-8381-987837e3383e-kube-api-access-jxnx6\") pod \"d69f3ce2-7166-46f3-8381-987837e3383e\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") "
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.861168 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-scripts\") pod \"d69f3ce2-7166-46f3-8381-987837e3383e\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") "
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.861240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-combined-ca-bundle\") pod \"d69f3ce2-7166-46f3-8381-987837e3383e\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") "
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.861290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-config-data\") pod \"d69f3ce2-7166-46f3-8381-987837e3383e\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") "
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.861343 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69f3ce2-7166-46f3-8381-987837e3383e-logs\") pod \"d69f3ce2-7166-46f3-8381-987837e3383e\" (UID: \"d69f3ce2-7166-46f3-8381-987837e3383e\") "
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.861895 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d69f3ce2-7166-46f3-8381-987837e3383e-logs" (OuterVolumeSpecName: "logs") pod "d69f3ce2-7166-46f3-8381-987837e3383e" (UID: "d69f3ce2-7166-46f3-8381-987837e3383e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.866601 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-scripts" (OuterVolumeSpecName: "scripts") pod "d69f3ce2-7166-46f3-8381-987837e3383e" (UID: "d69f3ce2-7166-46f3-8381-987837e3383e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.870234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69f3ce2-7166-46f3-8381-987837e3383e-kube-api-access-jxnx6" (OuterVolumeSpecName: "kube-api-access-jxnx6") pod "d69f3ce2-7166-46f3-8381-987837e3383e" (UID: "d69f3ce2-7166-46f3-8381-987837e3383e"). InnerVolumeSpecName "kube-api-access-jxnx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.884747 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d69f3ce2-7166-46f3-8381-987837e3383e" (UID: "d69f3ce2-7166-46f3-8381-987837e3383e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.897724 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-config-data" (OuterVolumeSpecName: "config-data") pod "d69f3ce2-7166-46f3-8381-987837e3383e" (UID: "d69f3ce2-7166-46f3-8381-987837e3383e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.962935 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69f3ce2-7166-46f3-8381-987837e3383e-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.962971 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxnx6\" (UniqueName: \"kubernetes.io/projected/d69f3ce2-7166-46f3-8381-987837e3383e-kube-api-access-jxnx6\") on node \"crc\" DevicePath \"\""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.962982 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.962991 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:10:06 crc kubenswrapper[4786]: I0313 12:10:06.962999 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69f3ce2-7166-46f3-8381-987837e3383e-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.217781 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5cf57c7fc-2x5rg"]
Mar 13 12:10:07 crc kubenswrapper[4786]: E0313 12:10:07.231257 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69f3ce2-7166-46f3-8381-987837e3383e" containerName="placement-db-sync"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.231295 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69f3ce2-7166-46f3-8381-987837e3383e" containerName="placement-db-sync"
Mar 13 12:10:07 crc kubenswrapper[4786]: E0313 12:10:07.231316 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855c715a-2a47-4dc6-ac8c-d5443ab2f0f9" containerName="oc"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.231323 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="855c715a-2a47-4dc6-ac8c-d5443ab2f0f9" containerName="oc"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.231500 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69f3ce2-7166-46f3-8381-987837e3383e" containerName="placement-db-sync"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.231517 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="855c715a-2a47-4dc6-ac8c-d5443ab2f0f9" containerName="oc"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.234332 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cf57c7fc-2x5rg"]
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.234480 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.236852 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.247745 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.375941 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-ovndb-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.375993 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-public-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.376025 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-combined-ca-bundle\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.376100 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-httpd-config\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.376122 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btzwz\" (UniqueName: \"kubernetes.io/projected/1664e190-3182-45ed-8365-e4ac4dccf4cd-kube-api-access-btzwz\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.376149 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-internal-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.376183 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-config\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.451640 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a1d014-b7f1-4009-942d-3a4794a8f675" path="/var/lib/kubelet/pods/b6a1d014-b7f1-4009-942d-3a4794a8f675/volumes"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.452611 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" event={"ID":"dda617df-8768-489a-8bbd-5876eb587961","Type":"ContainerStarted","Data":"7da91ebeef699e9de315563c6fce678e89e604873a19738404bc1764f949d732"}
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.452637 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.453004 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x5ns6" event={"ID":"d69f3ce2-7166-46f3-8381-987837e3383e","Type":"ContainerDied","Data":"d6c12a35ad8e7a4161a9a6ec101c8134f50fe79ef1470636e95afe7578c2848b"}
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.453067 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c12a35ad8e7a4161a9a6ec101c8134f50fe79ef1470636e95afe7578c2848b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.453557 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x5ns6"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.477468 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-httpd-config\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.477611 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btzwz\" (UniqueName: \"kubernetes.io/projected/1664e190-3182-45ed-8365-e4ac4dccf4cd-kube-api-access-btzwz\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.477663 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-internal-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.477716 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-config\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.477780 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-ovndb-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.477819 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-public-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.477943 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-combined-ca-bundle\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.483103 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-combined-ca-bundle\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.486440 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-config\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.493601 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" podStartSLOduration=3.493585166 podStartE2EDuration="3.493585166s" podCreationTimestamp="2026-03-13 12:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:07.482131004 +0000 UTC m=+1394.761784451" watchObservedRunningTime="2026-03-13 12:10:07.493585166 +0000 UTC m=+1394.773238613"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.495447 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-public-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.496239 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-ovndb-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.496596 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-internal-tls-certs\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.501413 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-httpd-config\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.504671 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btzwz\" (UniqueName: \"kubernetes.io/projected/1664e190-3182-45ed-8365-e4ac4dccf4cd-kube-api-access-btzwz\") pod \"neutron-5cf57c7fc-2x5rg\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.537656 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7d4596df6b-5xl5b"]
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.539215 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.544591 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7jf6w"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.547748 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.547990 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.548259 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.552069 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.557541 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cf57c7fc-2x5rg"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.564151 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7d4596df6b-5xl5b"]
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.692516 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-logs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.692621 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-combined-ca-bundle\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.692654 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-config-data\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.692741 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5prl\" (UniqueName: \"kubernetes.io/projected/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-kube-api-access-r5prl\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.692871 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-scripts\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.692945 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-public-tls-certs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.693000 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-internal-tls-certs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.794593 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5prl\" (UniqueName: \"kubernetes.io/projected/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-kube-api-access-r5prl\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.794980 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-scripts\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.795047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-public-tls-certs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.795087 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-internal-tls-certs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.795208 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-logs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.795420 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-combined-ca-bundle\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.795470 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-config-data\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.804445 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-public-tls-certs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.804982 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-internal-tls-certs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.805157 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-config-data\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.806189 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-logs\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.808584 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-combined-ca-bundle\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.811750 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-scripts\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.818711 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5prl\" (UniqueName: \"kubernetes.io/projected/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-kube-api-access-r5prl\") pod \"placement-7d4596df6b-5xl5b\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.901767 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7d4596df6b-5xl5b"
Mar 13 12:10:07 crc kubenswrapper[4786]: I0313 12:10:07.919658 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5rb8r"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:07.999139 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qklb7\" (UniqueName: \"kubernetes.io/projected/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-kube-api-access-qklb7\") pod \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:07.999254 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-combined-ca-bundle\") pod \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:07.999295 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-db-sync-config-data\") pod \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\" (UID: \"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.006354 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-kube-api-access-qklb7" (OuterVolumeSpecName: "kube-api-access-qklb7") pod "108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb" (UID: "108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb"). InnerVolumeSpecName "kube-api-access-qklb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.009533 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb" (UID: "108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.037525 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gtn8s"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.047312 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb" (UID: "108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.101197 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-fernet-keys\") pod \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.101268 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfdcz\" (UniqueName: \"kubernetes.io/projected/40623686-b681-4c6b-aa73-5b5ac94e4a4c-kube-api-access-zfdcz\") pod \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.101351 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-scripts\") pod \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.101505 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-combined-ca-bundle\") pod \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.101615 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-config-data\") pod \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.101681 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-credential-keys\") pod \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\" (UID: \"40623686-b681-4c6b-aa73-5b5ac94e4a4c\") "
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.102120 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qklb7\" (UniqueName: \"kubernetes.io/projected/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-kube-api-access-qklb7\") on node \"crc\" DevicePath \"\""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.102142 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.102155 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.106366 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-scripts" (OuterVolumeSpecName: "scripts") pod "40623686-b681-4c6b-aa73-5b5ac94e4a4c" (UID: "40623686-b681-4c6b-aa73-5b5ac94e4a4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.107433 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "40623686-b681-4c6b-aa73-5b5ac94e4a4c" (UID: "40623686-b681-4c6b-aa73-5b5ac94e4a4c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.108844 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40623686-b681-4c6b-aa73-5b5ac94e4a4c-kube-api-access-zfdcz" (OuterVolumeSpecName: "kube-api-access-zfdcz") pod "40623686-b681-4c6b-aa73-5b5ac94e4a4c" (UID: "40623686-b681-4c6b-aa73-5b5ac94e4a4c"). InnerVolumeSpecName "kube-api-access-zfdcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.117289 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "40623686-b681-4c6b-aa73-5b5ac94e4a4c" (UID: "40623686-b681-4c6b-aa73-5b5ac94e4a4c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.134802 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40623686-b681-4c6b-aa73-5b5ac94e4a4c" (UID: "40623686-b681-4c6b-aa73-5b5ac94e4a4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.136170 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-config-data" (OuterVolumeSpecName: "config-data") pod "40623686-b681-4c6b-aa73-5b5ac94e4a4c" (UID: "40623686-b681-4c6b-aa73-5b5ac94e4a4c"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.205858 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.205903 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.205917 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.207551 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.207568 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfdcz\" (UniqueName: \"kubernetes.io/projected/40623686-b681-4c6b-aa73-5b5ac94e4a4c-kube-api-access-zfdcz\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.207578 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40623686-b681-4c6b-aa73-5b5ac94e4a4c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:08 crc kubenswrapper[4786]: W0313 12:10:08.277179 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1664e190_3182_45ed_8365_e4ac4dccf4cd.slice/crio-bd75ec3fafeef3f51b860acac24b5b8c3973616b2a298d80c095605f9e9891f5 WatchSource:0}: Error finding container 
bd75ec3fafeef3f51b860acac24b5b8c3973616b2a298d80c095605f9e9891f5: Status 404 returned error can't find the container with id bd75ec3fafeef3f51b860acac24b5b8c3973616b2a298d80c095605f9e9891f5 Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.283182 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cf57c7fc-2x5rg"] Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.466776 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gtn8s" event={"ID":"40623686-b681-4c6b-aa73-5b5ac94e4a4c","Type":"ContainerDied","Data":"6c9097f6c7fe4749a664032c040f5d338c216cace501c3a8a3e7cbb20f96909d"} Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.466822 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c9097f6c7fe4749a664032c040f5d338c216cace501c3a8a3e7cbb20f96909d" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.466931 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gtn8s" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.475406 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5rb8r" event={"ID":"108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb","Type":"ContainerDied","Data":"bfbb611e4387ad5b69833822f4fed363ded3fc70c736bd780cad7a7a3971a223"} Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.475447 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfbb611e4387ad5b69833822f4fed363ded3fc70c736bd780cad7a7a3971a223" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.475514 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5rb8r" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.486123 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf57c7fc-2x5rg" event={"ID":"1664e190-3182-45ed-8365-e4ac4dccf4cd","Type":"ContainerStarted","Data":"bd75ec3fafeef3f51b860acac24b5b8c3973616b2a298d80c095605f9e9891f5"} Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.519374 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7d4596df6b-5xl5b"] Mar 13 12:10:08 crc kubenswrapper[4786]: W0313 12:10:08.520713 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb56e5879_af0d_47cc_8ce9_0bc5437c77f3.slice/crio-481d4fcf9dc56ab905bca2c38cecf0e8b2c75a82907616098dbdb54976b11a8b WatchSource:0}: Error finding container 481d4fcf9dc56ab905bca2c38cecf0e8b2c75a82907616098dbdb54976b11a8b: Status 404 returned error can't find the container with id 481d4fcf9dc56ab905bca2c38cecf0e8b2c75a82907616098dbdb54976b11a8b Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.632371 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d7547f9f8-fzqlk"] Mar 13 12:10:08 crc kubenswrapper[4786]: E0313 12:10:08.632983 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb" containerName="barbican-db-sync" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.633000 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb" containerName="barbican-db-sync" Mar 13 12:10:08 crc kubenswrapper[4786]: E0313 12:10:08.633019 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40623686-b681-4c6b-aa73-5b5ac94e4a4c" containerName="keystone-bootstrap" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.633026 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="40623686-b681-4c6b-aa73-5b5ac94e4a4c" containerName="keystone-bootstrap" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.633205 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb" containerName="barbican-db-sync" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.633228 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="40623686-b681-4c6b-aa73-5b5ac94e4a4c" containerName="keystone-bootstrap" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.633794 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.635500 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.635668 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.635784 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.636034 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.636304 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.636427 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-66qfn" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.647387 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d7547f9f8-fzqlk"] Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.699765 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"] Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.701605 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.713504 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.713758 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rz4kw" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.713944 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.716170 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-scripts\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.716274 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-combined-ca-bundle\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.716334 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvhsp\" (UniqueName: \"kubernetes.io/projected/c03ed618-9a09-48b0-84d4-873357872d22-kube-api-access-gvhsp\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 
12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.716393 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-config-data\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.716493 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-internal-tls-certs\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.716515 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-public-tls-certs\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.716591 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-fernet-keys\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.716679 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-credential-keys\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc 
kubenswrapper[4786]: I0313 12:10:08.750497 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"] Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.776019 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-64887cd695-qvt7h"] Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.777437 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.784097 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.847567 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.847633 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data-custom\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.847652 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wd7g\" (UniqueName: \"kubernetes.io/projected/ce0ba05c-0aea-424d-973a-6a90f9f85683-kube-api-access-9wd7g\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc 
kubenswrapper[4786]: I0313 12:10:08.847703 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.847748 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6931ea0-428f-4f4a-991b-532c8064542f-logs\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.847798 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-config-data\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.847956 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-internal-tls-certs\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.847997 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-public-tls-certs\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 
12:10:08.848057 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-combined-ca-bundle\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848106 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-fernet-keys\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848159 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce0ba05c-0aea-424d-973a-6a90f9f85683-logs\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848177 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wx9d\" (UniqueName: \"kubernetes.io/projected/b6931ea0-428f-4f4a-991b-532c8064542f-kube-api-access-4wx9d\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848226 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-combined-ca-bundle\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " 
pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848290 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data-custom\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848339 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-credential-keys\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848384 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-scripts\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848474 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-combined-ca-bundle\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.848557 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvhsp\" (UniqueName: \"kubernetes.io/projected/c03ed618-9a09-48b0-84d4-873357872d22-kube-api-access-gvhsp\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " 
pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.856325 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-64887cd695-qvt7h"] Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.890952 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r"] Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.912277 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-combined-ca-bundle\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.912861 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-public-tls-certs\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.913384 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-fernet-keys\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.913678 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-internal-tls-certs\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.913922 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-config-data\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.916046 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-credential-keys\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.918938 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-scripts\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.920435 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvhsp\" (UniqueName: \"kubernetes.io/projected/c03ed618-9a09-48b0-84d4-873357872d22-kube-api-access-gvhsp\") pod \"keystone-d7547f9f8-fzqlk\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.964807 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data-custom\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.964916 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.964939 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data-custom\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.964957 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wd7g\" (UniqueName: \"kubernetes.io/projected/ce0ba05c-0aea-424d-973a-6a90f9f85683-kube-api-access-9wd7g\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.964977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.965468 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6931ea0-428f-4f4a-991b-532c8064542f-logs\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.965548 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-combined-ca-bundle\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.965591 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce0ba05c-0aea-424d-973a-6a90f9f85683-logs\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.965613 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wx9d\" (UniqueName: \"kubernetes.io/projected/b6931ea0-428f-4f4a-991b-532c8064542f-kube-api-access-4wx9d\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.965637 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-combined-ca-bundle\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.967127 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6931ea0-428f-4f4a-991b-532c8064542f-logs\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.968197 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce0ba05c-0aea-424d-973a-6a90f9f85683-logs\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.972781 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7547f9f8-fzqlk"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.973476 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data-custom\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.974148 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.980575 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-7dk2d"]
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.982082 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.991850 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data-custom\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.992614 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-combined-ca-bundle\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.992617 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"
Mar 13 12:10:08 crc kubenswrapper[4786]: I0313 12:10:08.995383 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-combined-ca-bundle\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.003487 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wd7g\" (UniqueName: \"kubernetes.io/projected/ce0ba05c-0aea-424d-973a-6a90f9f85683-kube-api-access-9wd7g\") pod \"barbican-worker-64887cd695-qvt7h\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " pod="openstack/barbican-worker-64887cd695-qvt7h"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.012083 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-7dk2d"]
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.014183 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wx9d\" (UniqueName: \"kubernetes.io/projected/b6931ea0-428f-4f4a-991b-532c8064542f-kube-api-access-4wx9d\") pod \"barbican-keystone-listener-5d7f9bf5db-bwzfl\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.029938 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7569c6d56c-2c7lj"]
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.031538 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.042525 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.050402 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-dbb857556-g9c6x"]
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.052066 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.066928 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-svc\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.067229 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.067356 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmtqj\" (UniqueName: \"kubernetes.io/projected/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-kube-api-access-gmtqj\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.067471 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-config\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.067573 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.067719 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.072484 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-dbb857556-g9c6x"]
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.081528 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7569c6d56c-2c7lj"]
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.144799 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56fb7846b-ms2zg"]
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.146630 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.156571 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.160511 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56fb7846b-ms2zg"]
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.168955 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-combined-ca-bundle\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.168997 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169018 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b5lb\" (UniqueName: \"kubernetes.io/projected/14222e06-64a4-424f-9b69-cb6d2b62c001-kube-api-access-2b5lb\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169074 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169141 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqzff\" (UniqueName: \"kubernetes.io/projected/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-kube-api-access-vqzff\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169191 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-combined-ca-bundle\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-svc\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169252 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data-custom\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169273 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14222e06-64a4-424f-9b69-cb6d2b62c001-logs\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169295 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169311 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmtqj\" (UniqueName: \"kubernetes.io/projected/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-kube-api-access-gmtqj\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169326 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-logs\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data-custom\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169364 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-config\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.169383 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.170190 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.170557 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-svc\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.170757 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.170997 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.171308 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-config\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.188450 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-64887cd695-qvt7h"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.189302 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmtqj\" (UniqueName: \"kubernetes.io/projected/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-kube-api-access-gmtqj\") pod \"dnsmasq-dns-77cf8fb985-7dk2d\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282502 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21631dfa-11bd-41ad-a325-7c4136be967c-logs\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282561 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data-custom\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282600 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14222e06-64a4-424f-9b69-cb6d2b62c001-logs\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282634 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-logs\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282663 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data-custom\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282690 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data-custom\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282765 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-combined-ca-bundle\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282790 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282816 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b5lb\" (UniqueName: \"kubernetes.io/projected/14222e06-64a4-424f-9b69-cb6d2b62c001-kube-api-access-2b5lb\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282858 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282924 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqzff\" (UniqueName: \"kubernetes.io/projected/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-kube-api-access-vqzff\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.282988 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-combined-ca-bundle\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.283019 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-combined-ca-bundle\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.283534 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14222e06-64a4-424f-9b69-cb6d2b62c001-logs\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.283653 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bstq9\" (UniqueName: \"kubernetes.io/projected/21631dfa-11bd-41ad-a325-7c4136be967c-kube-api-access-bstq9\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.283868 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-logs\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.294498 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data-custom\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.299442 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-combined-ca-bundle\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.300169 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.300344 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data-custom\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.307323 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.307417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqzff\" (UniqueName: \"kubernetes.io/projected/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-kube-api-access-vqzff\") pod \"barbican-worker-7569c6d56c-2c7lj\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.307822 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b5lb\" (UniqueName: \"kubernetes.io/projected/14222e06-64a4-424f-9b69-cb6d2b62c001-kube-api-access-2b5lb\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.308039 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-combined-ca-bundle\") pod \"barbican-keystone-listener-dbb857556-g9c6x\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.311972 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.356328 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7569c6d56c-2c7lj"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.385864 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-combined-ca-bundle\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.386093 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bstq9\" (UniqueName: \"kubernetes.io/projected/21631dfa-11bd-41ad-a325-7c4136be967c-kube-api-access-bstq9\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.386190 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21631dfa-11bd-41ad-a325-7c4136be967c-logs\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.386305 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data-custom\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.386382 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.386829 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21631dfa-11bd-41ad-a325-7c4136be967c-logs\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.392302 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data-custom\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.392576 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-combined-ca-bundle\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.395313 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.406132 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.408648 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bstq9\" (UniqueName: \"kubernetes.io/projected/21631dfa-11bd-41ad-a325-7c4136be967c-kube-api-access-bstq9\") pod \"barbican-api-56fb7846b-ms2zg\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.473766 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56fb7846b-ms2zg"
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.498835 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf57c7fc-2x5rg" event={"ID":"1664e190-3182-45ed-8365-e4ac4dccf4cd","Type":"ContainerStarted","Data":"8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720"}
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.500628 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d4596df6b-5xl5b" event={"ID":"b56e5879-af0d-47cc-8ce9-0bc5437c77f3","Type":"ContainerStarted","Data":"481d4fcf9dc56ab905bca2c38cecf0e8b2c75a82907616098dbdb54976b11a8b"}
Mar 13 12:10:09 crc kubenswrapper[4786]: I0313 12:10:09.501026 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" podUID="dda617df-8768-489a-8bbd-5876eb587961" containerName="dnsmasq-dns" containerID="cri-o://7da91ebeef699e9de315563c6fce678e89e604873a19738404bc1764f949d732" gracePeriod=10
Mar 13 12:10:10 crc kubenswrapper[4786]: I0313 12:10:10.520331 4786 generic.go:334] "Generic (PLEG): container finished" podID="dda617df-8768-489a-8bbd-5876eb587961" containerID="7da91ebeef699e9de315563c6fce678e89e604873a19738404bc1764f949d732" exitCode=0
Mar 13 12:10:10 crc kubenswrapper[4786]: I0313 12:10:10.520371 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" event={"ID":"dda617df-8768-489a-8bbd-5876eb587961","Type":"ContainerDied","Data":"7da91ebeef699e9de315563c6fce678e89e604873a19738404bc1764f949d732"}
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.428684 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9d9fb9c86-4lc8x"]
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.431070 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d9fb9c86-4lc8x"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.441349 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.441631 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.514253 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d9fb9c86-4lc8x"]
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.548923 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.549169 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-internal-tls-certs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.549202 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-combined-ca-bundle\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.549225 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8cx7\" (UniqueName: \"kubernetes.io/projected/124c632a-4ff3-419c-9e26-ba68929feeb7-kube-api-access-s8cx7\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.549289 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/124c632a-4ff3-419c-9e26-ba68929feeb7-logs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.549373 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-public-tls-certs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.549398 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data-custom\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x"
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.554699 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" event={"ID":"dda617df-8768-489a-8bbd-5876eb587961","Type":"ContainerDied","Data":"8dade165fc54fed3ef517ddce06bb0b42af163a1ba311ed0799d399ddae7020e"}
Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.554745 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dade165fc54fed3ef517ddce06bb0b42af163a1ba311ed0799d399ddae7020e"
Mar 13 12:10:11 crc 
kubenswrapper[4786]: I0313 12:10:11.579352 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.579792 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.614578 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.642321 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.659857 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.659920 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-internal-tls-certs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.660057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-combined-ca-bundle\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.660086 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s8cx7\" (UniqueName: \"kubernetes.io/projected/124c632a-4ff3-419c-9e26-ba68929feeb7-kube-api-access-s8cx7\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.660202 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/124c632a-4ff3-419c-9e26-ba68929feeb7-logs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.660351 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-public-tls-certs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.660397 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data-custom\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.665404 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/124c632a-4ff3-419c-9e26-ba68929feeb7-logs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.672737 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data-custom\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.674214 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-internal-tls-certs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.675596 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.688566 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-public-tls-certs\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.721354 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-combined-ca-bundle\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.725759 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.728757 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8cx7\" (UniqueName: \"kubernetes.io/projected/124c632a-4ff3-419c-9e26-ba68929feeb7-kube-api-access-s8cx7\") pod \"barbican-api-9d9fb9c86-4lc8x\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.762637 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-nb\") pod \"dda617df-8768-489a-8bbd-5876eb587961\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.762685 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-swift-storage-0\") pod \"dda617df-8768-489a-8bbd-5876eb587961\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.762751 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmgwm\" (UniqueName: \"kubernetes.io/projected/dda617df-8768-489a-8bbd-5876eb587961-kube-api-access-hmgwm\") pod \"dda617df-8768-489a-8bbd-5876eb587961\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.762851 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-config\") pod \"dda617df-8768-489a-8bbd-5876eb587961\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.762947 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-svc\") pod 
\"dda617df-8768-489a-8bbd-5876eb587961\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.763035 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-sb\") pod \"dda617df-8768-489a-8bbd-5876eb587961\" (UID: \"dda617df-8768-489a-8bbd-5876eb587961\") " Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.764801 4786 scope.go:117] "RemoveContainer" containerID="8e2665de53fe5c4b48e3c6283ea00b58ec23a8c2f87e821662e474a78b5088b8" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.794562 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda617df-8768-489a-8bbd-5876eb587961-kube-api-access-hmgwm" (OuterVolumeSpecName: "kube-api-access-hmgwm") pod "dda617df-8768-489a-8bbd-5876eb587961" (UID: "dda617df-8768-489a-8bbd-5876eb587961"). InnerVolumeSpecName "kube-api-access-hmgwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.812853 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.865211 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmgwm\" (UniqueName: \"kubernetes.io/projected/dda617df-8768-489a-8bbd-5876eb587961-kube-api-access-hmgwm\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.959677 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dda617df-8768-489a-8bbd-5876eb587961" (UID: "dda617df-8768-489a-8bbd-5876eb587961"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.967055 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.970685 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dda617df-8768-489a-8bbd-5876eb587961" (UID: "dda617df-8768-489a-8bbd-5876eb587961"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.973474 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-config" (OuterVolumeSpecName: "config") pod "dda617df-8768-489a-8bbd-5876eb587961" (UID: "dda617df-8768-489a-8bbd-5876eb587961"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.983759 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dda617df-8768-489a-8bbd-5876eb587961" (UID: "dda617df-8768-489a-8bbd-5876eb587961"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:11 crc kubenswrapper[4786]: I0313 12:10:11.985756 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dda617df-8768-489a-8bbd-5876eb587961" (UID: "dda617df-8768-489a-8bbd-5876eb587961"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.012994 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-7dk2d"] Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.076280 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.076675 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.076685 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.076696 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dda617df-8768-489a-8bbd-5876eb587961-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.119943 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-64887cd695-qvt7h"] Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.136452 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-dbb857556-g9c6x"] Mar 13 12:10:12 crc kubenswrapper[4786]: W0313 12:10:12.494419 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6931ea0_428f_4f4a_991b_532c8064542f.slice/crio-c1154dc7a352b71d4c6fab596572415d43429e52d8b4cc11e4d1a4cdcfea4ebb WatchSource:0}: Error finding container 
c1154dc7a352b71d4c6fab596572415d43429e52d8b4cc11e4d1a4cdcfea4ebb: Status 404 returned error can't find the container with id c1154dc7a352b71d4c6fab596572415d43429e52d8b4cc11e4d1a4cdcfea4ebb Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.502979 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56fb7846b-ms2zg"] Mar 13 12:10:12 crc kubenswrapper[4786]: W0313 12:10:12.506918 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21631dfa_11bd_41ad_a325_7c4136be967c.slice/crio-36ccdf4d3e91c690af3c00b001b3aa77d535f66ff901e8e4ce98e41d376668c3 WatchSource:0}: Error finding container 36ccdf4d3e91c690af3c00b001b3aa77d535f66ff901e8e4ce98e41d376668c3: Status 404 returned error can't find the container with id 36ccdf4d3e91c690af3c00b001b3aa77d535f66ff901e8e4ce98e41d376668c3 Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.510828 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"] Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.531048 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7569c6d56c-2c7lj"] Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.536271 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d7547f9f8-fzqlk"] Mar 13 12:10:12 crc kubenswrapper[4786]: W0313 12:10:12.560505 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc03ed618_9a09_48b0_84d4_873357872d22.slice/crio-559eb214a9ad3cbb50ac47afa586137df65f794c357381a4dc40c61df0bf8e84 WatchSource:0}: Error finding container 559eb214a9ad3cbb50ac47afa586137df65f794c357381a4dc40c61df0bf8e84: Status 404 returned error can't find the container with id 559eb214a9ad3cbb50ac47afa586137df65f794c357381a4dc40c61df0bf8e84 Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 
12:10:12.574268 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d9fb9c86-4lc8x"] Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.581214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerStarted","Data":"df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.582488 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" event={"ID":"14222e06-64a4-424f-9b69-cb6d2b62c001","Type":"ContainerStarted","Data":"88ab0757184513178d6464ffc74b12848dc38e911717aca34daf0d7bfcaa2800"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.584102 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" containerID="b058fdc20733428aa91371a37717688b5c9eb0c8f0b8b50f9e4ee0db1ba1c047" exitCode=0 Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.584201 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" event={"ID":"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10","Type":"ContainerDied","Data":"b058fdc20733428aa91371a37717688b5c9eb0c8f0b8b50f9e4ee0db1ba1c047"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.584232 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" event={"ID":"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10","Type":"ContainerStarted","Data":"9c4ea8af88916bd9e0cec388cb8ef7f762338ed64db29b86f1f3984addebfed3"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.589128 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64887cd695-qvt7h" event={"ID":"ce0ba05c-0aea-424d-973a-6a90f9f85683","Type":"ContainerStarted","Data":"ad7cfbc827199609fd58d71ed26980539647359b553b7709c20c3a3ad73ed89c"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 
12:10:12.592770 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf57c7fc-2x5rg" event={"ID":"1664e190-3182-45ed-8365-e4ac4dccf4cd","Type":"ContainerStarted","Data":"2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.593050 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5cf57c7fc-2x5rg" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.596454 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" event={"ID":"b6931ea0-428f-4f4a-991b-532c8064542f","Type":"ContainerStarted","Data":"c1154dc7a352b71d4c6fab596572415d43429e52d8b4cc11e4d1a4cdcfea4ebb"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.598119 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d4596df6b-5xl5b" event={"ID":"b56e5879-af0d-47cc-8ce9-0bc5437c77f3","Type":"ContainerStarted","Data":"a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.598148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d4596df6b-5xl5b" event={"ID":"b56e5879-af0d-47cc-8ce9-0bc5437c77f3","Type":"ContainerStarted","Data":"75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.598209 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7d4596df6b-5xl5b" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.598231 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7d4596df6b-5xl5b" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.599945 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.600014 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56fb7846b-ms2zg" event={"ID":"21631dfa-11bd-41ad-a325-7c4136be967c","Type":"ContainerStarted","Data":"36ccdf4d3e91c690af3c00b001b3aa77d535f66ff901e8e4ce98e41d376668c3"} Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.601021 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.601061 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.630304 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5cf57c7fc-2x5rg" podStartSLOduration=5.630286503 podStartE2EDuration="5.630286503s" podCreationTimestamp="2026-03-13 12:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:12.62612612 +0000 UTC m=+1399.905779567" watchObservedRunningTime="2026-03-13 12:10:12.630286503 +0000 UTC m=+1399.909939950" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.651873 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7d4596df6b-5xl5b" podStartSLOduration=5.651857849 podStartE2EDuration="5.651857849s" podCreationTimestamp="2026-03-13 12:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:12.650190252 +0000 UTC m=+1399.929843709" watchObservedRunningTime="2026-03-13 12:10:12.651857849 +0000 UTC m=+1399.931511296" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.668629 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.669271 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.678736 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r"] Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.693671 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d8b7f7f4c-8jb5r"] Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.714346 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:12 crc kubenswrapper[4786]: I0313 12:10:12.726393 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.466563 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda617df-8768-489a-8bbd-5876eb587961" path="/var/lib/kubelet/pods/dda617df-8768-489a-8bbd-5876eb587961/volumes" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.625245 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7547f9f8-fzqlk" event={"ID":"c03ed618-9a09-48b0-84d4-873357872d22","Type":"ContainerStarted","Data":"495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.625288 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7547f9f8-fzqlk" event={"ID":"c03ed618-9a09-48b0-84d4-873357872d22","Type":"ContainerStarted","Data":"559eb214a9ad3cbb50ac47afa586137df65f794c357381a4dc40c61df0bf8e84"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.626302 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 
12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.632229 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7569c6d56c-2c7lj" event={"ID":"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5","Type":"ContainerStarted","Data":"b7c402ea2a2d8ff40719f66c2cd8d5d07da290ffae5d34cd97e065ea0e3a53e0"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.643571 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fb9c86-4lc8x" event={"ID":"124c632a-4ff3-419c-9e26-ba68929feeb7","Type":"ContainerStarted","Data":"2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.643833 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fb9c86-4lc8x" event={"ID":"124c632a-4ff3-419c-9e26-ba68929feeb7","Type":"ContainerStarted","Data":"09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.643957 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fb9c86-4lc8x" event={"ID":"124c632a-4ff3-419c-9e26-ba68929feeb7","Type":"ContainerStarted","Data":"b7f507a6996927896ac0fac8ac658c01d7c23f1bf0a3cfe59c5613fae465b2d3"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.644737 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.645241 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.646784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56fb7846b-ms2zg" event={"ID":"21631dfa-11bd-41ad-a325-7c4136be967c","Type":"ContainerStarted","Data":"1c6841d048b400b12515dc52640dc6fd7a71fbeef15df148fc954c73be47422f"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.646809 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56fb7846b-ms2zg" event={"ID":"21631dfa-11bd-41ad-a325-7c4136be967c","Type":"ContainerStarted","Data":"873478a5556be3bed25bddaf8ad90b31f8b27f29235630d364b4dd778164e138"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.647588 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56fb7846b-ms2zg" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.647618 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56fb7846b-ms2zg" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.661436 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d7547f9f8-fzqlk" podStartSLOduration=5.661412951 podStartE2EDuration="5.661412951s" podCreationTimestamp="2026-03-13 12:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:13.653526417 +0000 UTC m=+1400.933179874" watchObservedRunningTime="2026-03-13 12:10:13.661412951 +0000 UTC m=+1400.941066408" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.663974 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" event={"ID":"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10","Type":"ContainerStarted","Data":"c521db66dc54a5126f8b6cb3105faf44beb7753db75d2421f365e41e76b99123"} Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.664709 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.665843 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.665868 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.691138 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56fb7846b-ms2zg" podStartSLOduration=4.691118547 podStartE2EDuration="4.691118547s" podCreationTimestamp="2026-03-13 12:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:13.681293011 +0000 UTC m=+1400.960946458" watchObservedRunningTime="2026-03-13 12:10:13.691118547 +0000 UTC m=+1400.970772004" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.732086 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" podStartSLOduration=5.732065278 podStartE2EDuration="5.732065278s" podCreationTimestamp="2026-03-13 12:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:13.722989912 +0000 UTC m=+1401.002643359" watchObservedRunningTime="2026-03-13 12:10:13.732065278 +0000 UTC m=+1401.011718725" Mar 13 12:10:13 crc kubenswrapper[4786]: I0313 12:10:13.733142 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9d9fb9c86-4lc8x" podStartSLOduration=2.733135647 podStartE2EDuration="2.733135647s" podCreationTimestamp="2026-03-13 12:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:13.700008738 +0000 UTC m=+1400.979662195" watchObservedRunningTime="2026-03-13 12:10:13.733135647 +0000 UTC m=+1401.012789094" Mar 13 12:10:15 crc kubenswrapper[4786]: I0313 12:10:15.029719 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:10:15 crc kubenswrapper[4786]: I0313 
12:10:15.030216 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:10:15 crc kubenswrapper[4786]: I0313 12:10:15.036748 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:10:15 crc kubenswrapper[4786]: I0313 12:10:15.707734 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:10:15 crc kubenswrapper[4786]: I0313 12:10:15.708048 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:10:15 crc kubenswrapper[4786]: I0313 12:10:15.914608 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.103599 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.734176 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64887cd695-qvt7h" event={"ID":"ce0ba05c-0aea-424d-973a-6a90f9f85683","Type":"ContainerStarted","Data":"9a2070cc35eb3f535bb3d81baf3135980c55ed9fea89cda596f2eb1d0b82033a"} Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.734532 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64887cd695-qvt7h" event={"ID":"ce0ba05c-0aea-424d-973a-6a90f9f85683","Type":"ContainerStarted","Data":"b8a5d83e291bea4c8a9ab51d11d47cae0ee27095627ce55a3ffad47a97e3ced8"} Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.738581 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7569c6d56c-2c7lj" event={"ID":"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5","Type":"ContainerStarted","Data":"e0a352ffff2ccb7dffda5744bd27fe06dce994d483616356c61b3ccede71f2c0"} Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.738636 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-worker-7569c6d56c-2c7lj" event={"ID":"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5","Type":"ContainerStarted","Data":"7543fdc8467c20ef9ae263084f0987465a93b904cd4f492afe5d1897a9e24ab1"} Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.759915 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-64887cd695-qvt7h" podStartSLOduration=5.659009976 podStartE2EDuration="8.759899205s" podCreationTimestamp="2026-03-13 12:10:08 +0000 UTC" firstStartedPulling="2026-03-13 12:10:12.24543626 +0000 UTC m=+1399.525089707" lastFinishedPulling="2026-03-13 12:10:15.346325489 +0000 UTC m=+1402.625978936" observedRunningTime="2026-03-13 12:10:16.756013709 +0000 UTC m=+1404.035667156" watchObservedRunningTime="2026-03-13 12:10:16.759899205 +0000 UTC m=+1404.039552662" Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.760119 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" event={"ID":"14222e06-64a4-424f-9b69-cb6d2b62c001","Type":"ContainerStarted","Data":"54b263afbc6e8053c748bc362cfd1bd6dfe5af3c222dd7810fe8d319eb6c30f9"} Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.760173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" event={"ID":"14222e06-64a4-424f-9b69-cb6d2b62c001","Type":"ContainerStarted","Data":"d022ca1a7d8f88f231dbd73accdaf4bd33c41433e29373fd3195771df609a146"} Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.782013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" event={"ID":"b6931ea0-428f-4f4a-991b-532c8064542f","Type":"ContainerStarted","Data":"838d3467d00145949ebefd73a9d755950c6efb1c51fcebd7cafa505f4c06de12"} Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.782053 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" event={"ID":"b6931ea0-428f-4f4a-991b-532c8064542f","Type":"ContainerStarted","Data":"ba1fe6100258a62b71d3a9bd554107532f7443518a494f382e2a2ba04fc4c601"} Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.796809 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7569c6d56c-2c7lj" podStartSLOduration=6.015247172 podStartE2EDuration="8.796789785s" podCreationTimestamp="2026-03-13 12:10:08 +0000 UTC" firstStartedPulling="2026-03-13 12:10:12.566522053 +0000 UTC m=+1399.846175490" lastFinishedPulling="2026-03-13 12:10:15.348064656 +0000 UTC m=+1402.627718103" observedRunningTime="2026-03-13 12:10:16.76932115 +0000 UTC m=+1404.048974617" watchObservedRunningTime="2026-03-13 12:10:16.796789785 +0000 UTC m=+1404.076443232" Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.829185 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" podStartSLOduration=5.643097023 podStartE2EDuration="8.829156133s" podCreationTimestamp="2026-03-13 12:10:08 +0000 UTC" firstStartedPulling="2026-03-13 12:10:12.158174923 +0000 UTC m=+1399.437828370" lastFinishedPulling="2026-03-13 12:10:15.344234033 +0000 UTC m=+1402.623887480" observedRunningTime="2026-03-13 12:10:16.79619861 +0000 UTC m=+1404.075852057" watchObservedRunningTime="2026-03-13 12:10:16.829156133 +0000 UTC m=+1404.108809590" Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.867610 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-64887cd695-qvt7h"] Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.913194 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" podStartSLOduration=6.065232818 podStartE2EDuration="8.913174353s" podCreationTimestamp="2026-03-13 12:10:08 +0000 UTC" firstStartedPulling="2026-03-13 
12:10:12.496860683 +0000 UTC m=+1399.776514130" lastFinishedPulling="2026-03-13 12:10:15.344802218 +0000 UTC m=+1402.624455665" observedRunningTime="2026-03-13 12:10:16.81759695 +0000 UTC m=+1404.097250417" watchObservedRunningTime="2026-03-13 12:10:16.913174353 +0000 UTC m=+1404.192827810" Mar 13 12:10:16 crc kubenswrapper[4786]: I0313 12:10:16.939111 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"] Mar 13 12:10:17 crc kubenswrapper[4786]: I0313 12:10:17.793655 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-59gr5" event={"ID":"6d0fe660-4646-4b25-b5b6-b24989d78be4","Type":"ContainerStarted","Data":"7d26ec1d9ece2a37cced85d6beca1eecc881de15084de4e7ac289a248517ef2c"} Mar 13 12:10:17 crc kubenswrapper[4786]: I0313 12:10:17.817051 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-59gr5" podStartSLOduration=2.910756333 podStartE2EDuration="40.817034448s" podCreationTimestamp="2026-03-13 12:09:37 +0000 UTC" firstStartedPulling="2026-03-13 12:09:38.191470353 +0000 UTC m=+1365.471123800" lastFinishedPulling="2026-03-13 12:10:16.097748468 +0000 UTC m=+1403.377401915" observedRunningTime="2026-03-13 12:10:17.811082336 +0000 UTC m=+1405.090735783" watchObservedRunningTime="2026-03-13 12:10:17.817034448 +0000 UTC m=+1405.096687885" Mar 13 12:10:18 crc kubenswrapper[4786]: I0313 12:10:18.811097 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" containerName="barbican-keystone-listener" containerID="cri-o://838d3467d00145949ebefd73a9d755950c6efb1c51fcebd7cafa505f4c06de12" gracePeriod=30 Mar 13 12:10:18 crc kubenswrapper[4786]: I0313 12:10:18.811191 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-64887cd695-qvt7h" 
podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerName="barbican-worker" containerID="cri-o://9a2070cc35eb3f535bb3d81baf3135980c55ed9fea89cda596f2eb1d0b82033a" gracePeriod=30 Mar 13 12:10:18 crc kubenswrapper[4786]: I0313 12:10:18.811271 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-64887cd695-qvt7h" podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerName="barbican-worker-log" containerID="cri-o://b8a5d83e291bea4c8a9ab51d11d47cae0ee27095627ce55a3ffad47a97e3ced8" gracePeriod=30 Mar 13 12:10:18 crc kubenswrapper[4786]: I0313 12:10:18.810996 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" containerName="barbican-keystone-listener-log" containerID="cri-o://ba1fe6100258a62b71d3a9bd554107532f7443518a494f382e2a2ba04fc4c601" gracePeriod=30 Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.313088 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.376412 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd49cc99-szjf2"] Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.376659 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" podUID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerName="dnsmasq-dns" containerID="cri-o://c5588970471f8b564c9ea9a71679e58b680a67f6302cf48e6bf143ac427f7437" gracePeriod=10 Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.583766 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" podUID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused" Mar 13 12:10:19 crc 
kubenswrapper[4786]: I0313 12:10:19.841431 4786 generic.go:334] "Generic (PLEG): container finished" podID="b6931ea0-428f-4f4a-991b-532c8064542f" containerID="838d3467d00145949ebefd73a9d755950c6efb1c51fcebd7cafa505f4c06de12" exitCode=0 Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.841468 4786 generic.go:334] "Generic (PLEG): container finished" podID="b6931ea0-428f-4f4a-991b-532c8064542f" containerID="ba1fe6100258a62b71d3a9bd554107532f7443518a494f382e2a2ba04fc4c601" exitCode=143 Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.841503 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" event={"ID":"b6931ea0-428f-4f4a-991b-532c8064542f","Type":"ContainerDied","Data":"838d3467d00145949ebefd73a9d755950c6efb1c51fcebd7cafa505f4c06de12"} Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.841554 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" event={"ID":"b6931ea0-428f-4f4a-991b-532c8064542f","Type":"ContainerDied","Data":"ba1fe6100258a62b71d3a9bd554107532f7443518a494f382e2a2ba04fc4c601"} Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.843058 4786 generic.go:334] "Generic (PLEG): container finished" podID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerID="c5588970471f8b564c9ea9a71679e58b680a67f6302cf48e6bf143ac427f7437" exitCode=0 Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.843124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" event={"ID":"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b","Type":"ContainerDied","Data":"c5588970471f8b564c9ea9a71679e58b680a67f6302cf48e6bf143ac427f7437"} Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.845469 4786 generic.go:334] "Generic (PLEG): container finished" podID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerID="9a2070cc35eb3f535bb3d81baf3135980c55ed9fea89cda596f2eb1d0b82033a" exitCode=0 Mar 13 12:10:19 crc 
kubenswrapper[4786]: I0313 12:10:19.845497 4786 generic.go:334] "Generic (PLEG): container finished" podID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerID="b8a5d83e291bea4c8a9ab51d11d47cae0ee27095627ce55a3ffad47a97e3ced8" exitCode=143 Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.845517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64887cd695-qvt7h" event={"ID":"ce0ba05c-0aea-424d-973a-6a90f9f85683","Type":"ContainerDied","Data":"9a2070cc35eb3f535bb3d81baf3135980c55ed9fea89cda596f2eb1d0b82033a"} Mar 13 12:10:19 crc kubenswrapper[4786]: I0313 12:10:19.845533 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64887cd695-qvt7h" event={"ID":"ce0ba05c-0aea-424d-973a-6a90f9f85683","Type":"ContainerDied","Data":"b8a5d83e291bea4c8a9ab51d11d47cae0ee27095627ce55a3ffad47a97e3ced8"} Mar 13 12:10:20 crc kubenswrapper[4786]: I0313 12:10:20.896366 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56fb7846b-ms2zg" Mar 13 12:10:20 crc kubenswrapper[4786]: I0313 12:10:20.944317 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56fb7846b-ms2zg" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.279355 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.292738 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.298098 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.311667 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.321280 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-svc\") pod \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.321433 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data-custom\") pod \"b6931ea0-428f-4f4a-991b-532c8064542f\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322117 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wx9d\" (UniqueName: \"kubernetes.io/projected/b6931ea0-428f-4f4a-991b-532c8064542f-kube-api-access-4wx9d\") pod \"b6931ea0-428f-4f4a-991b-532c8064542f\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322298 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-combined-ca-bundle\") pod \"b6931ea0-428f-4f4a-991b-532c8064542f\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322374 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-sb\") pod \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322479 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6931ea0-428f-4f4a-991b-532c8064542f-logs\") pod \"b6931ea0-428f-4f4a-991b-532c8064542f\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322542 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-config\") pod \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-nb\") pod \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322656 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-swift-storage-0\") pod \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322736 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkr8p\" (UniqueName: \"kubernetes.io/projected/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-kube-api-access-wkr8p\") pod \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\" (UID: \"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.322774 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data\") pod \"b6931ea0-428f-4f4a-991b-532c8064542f\" (UID: \"b6931ea0-428f-4f4a-991b-532c8064542f\") " Mar 13 12:10:23 crc kubenswrapper[4786]: 
I0313 12:10:23.323048 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6931ea0-428f-4f4a-991b-532c8064542f-logs" (OuterVolumeSpecName: "logs") pod "b6931ea0-428f-4f4a-991b-532c8064542f" (UID: "b6931ea0-428f-4f4a-991b-532c8064542f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.324158 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6931ea0-428f-4f4a-991b-532c8064542f-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.331478 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.336299 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6931ea0-428f-4f4a-991b-532c8064542f-kube-api-access-4wx9d" (OuterVolumeSpecName: "kube-api-access-4wx9d") pod "b6931ea0-428f-4f4a-991b-532c8064542f" (UID: "b6931ea0-428f-4f4a-991b-532c8064542f"). InnerVolumeSpecName "kube-api-access-4wx9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.339643 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6931ea0-428f-4f4a-991b-532c8064542f" (UID: "b6931ea0-428f-4f4a-991b-532c8064542f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.357918 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6931ea0-428f-4f4a-991b-532c8064542f" (UID: "b6931ea0-428f-4f4a-991b-532c8064542f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.366342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-kube-api-access-wkr8p" (OuterVolumeSpecName: "kube-api-access-wkr8p") pod "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" (UID: "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b"). InnerVolumeSpecName "kube-api-access-wkr8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.425424 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-combined-ca-bundle\") pod \"ce0ba05c-0aea-424d-973a-6a90f9f85683\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.425547 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data\") pod \"ce0ba05c-0aea-424d-973a-6a90f9f85683\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.425666 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce0ba05c-0aea-424d-973a-6a90f9f85683-logs\") pod \"ce0ba05c-0aea-424d-973a-6a90f9f85683\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") 
" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.425772 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wd7g\" (UniqueName: \"kubernetes.io/projected/ce0ba05c-0aea-424d-973a-6a90f9f85683-kube-api-access-9wd7g\") pod \"ce0ba05c-0aea-424d-973a-6a90f9f85683\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.425791 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data-custom\") pod \"ce0ba05c-0aea-424d-973a-6a90f9f85683\" (UID: \"ce0ba05c-0aea-424d-973a-6a90f9f85683\") " Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.426597 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.426728 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wx9d\" (UniqueName: \"kubernetes.io/projected/b6931ea0-428f-4f4a-991b-532c8064542f-kube-api-access-4wx9d\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.426767 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.426804 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkr8p\" (UniqueName: \"kubernetes.io/projected/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-kube-api-access-wkr8p\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.427444 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ce0ba05c-0aea-424d-973a-6a90f9f85683-logs" (OuterVolumeSpecName: "logs") pod "ce0ba05c-0aea-424d-973a-6a90f9f85683" (UID: "ce0ba05c-0aea-424d-973a-6a90f9f85683"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.432116 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0ba05c-0aea-424d-973a-6a90f9f85683-kube-api-access-9wd7g" (OuterVolumeSpecName: "kube-api-access-9wd7g") pod "ce0ba05c-0aea-424d-973a-6a90f9f85683" (UID: "ce0ba05c-0aea-424d-973a-6a90f9f85683"). InnerVolumeSpecName "kube-api-access-9wd7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.432406 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data" (OuterVolumeSpecName: "config-data") pod "b6931ea0-428f-4f4a-991b-532c8064542f" (UID: "b6931ea0-428f-4f4a-991b-532c8064542f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.433060 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ce0ba05c-0aea-424d-973a-6a90f9f85683" (UID: "ce0ba05c-0aea-424d-973a-6a90f9f85683"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.446781 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" (UID: "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.464420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" (UID: "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.470341 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" (UID: "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.488723 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" (UID: "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.502182 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce0ba05c-0aea-424d-973a-6a90f9f85683" (UID: "ce0ba05c-0aea-424d-973a-6a90f9f85683"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.502423 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-config" (OuterVolumeSpecName: "config") pod "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" (UID: "ffcc2849-7c78-4b8d-a0ec-e629c3556a6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.532209 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data" (OuterVolumeSpecName: "config-data") pod "ce0ba05c-0aea-424d-973a-6a90f9f85683" (UID: "ce0ba05c-0aea-424d-973a-6a90f9f85683"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533616 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533646 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533656 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533665 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: 
I0313 12:10:23.533673 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533681 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533691 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6931ea0-428f-4f4a-991b-532c8064542f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533698 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce0ba05c-0aea-424d-973a-6a90f9f85683-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533706 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533713 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wd7g\" (UniqueName: \"kubernetes.io/projected/ce0ba05c-0aea-424d-973a-6a90f9f85683-kube-api-access-9wd7g\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.533723 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce0ba05c-0aea-424d-973a-6a90f9f85683-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.629746 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56fb7846b-ms2zg"] Mar 13 12:10:23 crc 
kubenswrapper[4786]: I0313 12:10:23.630007 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56fb7846b-ms2zg" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api-log" containerID="cri-o://873478a5556be3bed25bddaf8ad90b31f8b27f29235630d364b4dd778164e138" gracePeriod=30 Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.630161 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56fb7846b-ms2zg" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api" containerID="cri-o://1c6841d048b400b12515dc52640dc6fd7a71fbeef15df148fc954c73be47422f" gracePeriod=30 Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.919934 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerStarted","Data":"2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e"} Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.920085 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="ceilometer-central-agent" containerID="cri-o://187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0" gracePeriod=30 Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.920352 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.920593 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="proxy-httpd" containerID="cri-o://2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e" gracePeriod=30 Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.920638 4786 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="sg-core" containerID="cri-o://df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c" gracePeriod=30 Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.920670 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="ceilometer-notification-agent" containerID="cri-o://ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836" gracePeriod=30 Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.936569 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.936934 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl" event={"ID":"b6931ea0-428f-4f4a-991b-532c8064542f","Type":"ContainerDied","Data":"c1154dc7a352b71d4c6fab596572415d43429e52d8b4cc11e4d1a4cdcfea4ebb"} Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.936969 4786 scope.go:117] "RemoveContainer" containerID="838d3467d00145949ebefd73a9d755950c6efb1c51fcebd7cafa505f4c06de12" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.941439 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" event={"ID":"ffcc2849-7c78-4b8d-a0ec-e629c3556a6b","Type":"ContainerDied","Data":"3434b1e26cfefd8c4af479ef69443b0975fa36cb1747c9c56ffc31e4c54f847c"} Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.941467 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd49cc99-szjf2" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.944271 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-64887cd695-qvt7h" Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.944678 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64887cd695-qvt7h" event={"ID":"ce0ba05c-0aea-424d-973a-6a90f9f85683","Type":"ContainerDied","Data":"ad7cfbc827199609fd58d71ed26980539647359b553b7709c20c3a3ad73ed89c"} Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.954566 4786 generic.go:334] "Generic (PLEG): container finished" podID="21631dfa-11bd-41ad-a325-7c4136be967c" containerID="873478a5556be3bed25bddaf8ad90b31f8b27f29235630d364b4dd778164e138" exitCode=143 Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.954685 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56fb7846b-ms2zg" event={"ID":"21631dfa-11bd-41ad-a325-7c4136be967c","Type":"ContainerDied","Data":"873478a5556be3bed25bddaf8ad90b31f8b27f29235630d364b4dd778164e138"} Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.959122 4786 generic.go:334] "Generic (PLEG): container finished" podID="6d0fe660-4646-4b25-b5b6-b24989d78be4" containerID="7d26ec1d9ece2a37cced85d6beca1eecc881de15084de4e7ac289a248517ef2c" exitCode=0 Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.959285 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-59gr5" event={"ID":"6d0fe660-4646-4b25-b5b6-b24989d78be4","Type":"ContainerDied","Data":"7d26ec1d9ece2a37cced85d6beca1eecc881de15084de4e7ac289a248517ef2c"} Mar 13 12:10:23 crc kubenswrapper[4786]: I0313 12:10:23.969841 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.042835133 podStartE2EDuration="46.969821026s" podCreationTimestamp="2026-03-13 12:09:37 +0000 UTC" firstStartedPulling="2026-03-13 12:09:38.50317039 +0000 UTC m=+1365.782823837" lastFinishedPulling="2026-03-13 12:10:23.430156283 +0000 UTC m=+1410.709809730" observedRunningTime="2026-03-13 
12:10:23.94604468 +0000 UTC m=+1411.225698137" watchObservedRunningTime="2026-03-13 12:10:23.969821026 +0000 UTC m=+1411.249474483" Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.003194 4786 scope.go:117] "RemoveContainer" containerID="ba1fe6100258a62b71d3a9bd554107532f7443518a494f382e2a2ba04fc4c601" Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.009336 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd49cc99-szjf2"] Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.023132 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fd49cc99-szjf2"] Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.037974 4786 scope.go:117] "RemoveContainer" containerID="c5588970471f8b564c9ea9a71679e58b680a67f6302cf48e6bf143ac427f7437" Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.038141 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"] Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.055483 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5d7f9bf5db-bwzfl"] Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.065958 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-64887cd695-qvt7h"] Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.073501 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-64887cd695-qvt7h"] Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.103166 4786 scope.go:117] "RemoveContainer" containerID="73e9aa5fef6e5eca89e38f358224f2e80d0cb67879f44375d4c172979300e66a" Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.137686 4786 scope.go:117] "RemoveContainer" containerID="9a2070cc35eb3f535bb3d81baf3135980c55ed9fea89cda596f2eb1d0b82033a" Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.161038 4786 scope.go:117] "RemoveContainer" 
containerID="b8a5d83e291bea4c8a9ab51d11d47cae0ee27095627ce55a3ffad47a97e3ced8" Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.971424 4786 generic.go:334] "Generic (PLEG): container finished" podID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerID="2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e" exitCode=0 Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.971750 4786 generic.go:334] "Generic (PLEG): container finished" podID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerID="df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c" exitCode=2 Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.971763 4786 generic.go:334] "Generic (PLEG): container finished" podID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerID="187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0" exitCode=0 Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.971507 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerDied","Data":"2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e"} Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.971843 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerDied","Data":"df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c"} Mar 13 12:10:24 crc kubenswrapper[4786]: I0313 12:10:24.971862 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerDied","Data":"187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0"} Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.329276 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-59gr5" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.363367 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-scripts\") pod \"6d0fe660-4646-4b25-b5b6-b24989d78be4\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.363487 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-config-data\") pod \"6d0fe660-4646-4b25-b5b6-b24989d78be4\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.363650 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghrg5\" (UniqueName: \"kubernetes.io/projected/6d0fe660-4646-4b25-b5b6-b24989d78be4-kube-api-access-ghrg5\") pod \"6d0fe660-4646-4b25-b5b6-b24989d78be4\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.363729 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-db-sync-config-data\") pod \"6d0fe660-4646-4b25-b5b6-b24989d78be4\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.363809 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-combined-ca-bundle\") pod \"6d0fe660-4646-4b25-b5b6-b24989d78be4\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.364014 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/6d0fe660-4646-4b25-b5b6-b24989d78be4-etc-machine-id\") pod \"6d0fe660-4646-4b25-b5b6-b24989d78be4\" (UID: \"6d0fe660-4646-4b25-b5b6-b24989d78be4\") " Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.364849 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d0fe660-4646-4b25-b5b6-b24989d78be4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6d0fe660-4646-4b25-b5b6-b24989d78be4" (UID: "6d0fe660-4646-4b25-b5b6-b24989d78be4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.369493 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d0fe660-4646-4b25-b5b6-b24989d78be4" (UID: "6d0fe660-4646-4b25-b5b6-b24989d78be4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.372608 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0fe660-4646-4b25-b5b6-b24989d78be4-kube-api-access-ghrg5" (OuterVolumeSpecName: "kube-api-access-ghrg5") pod "6d0fe660-4646-4b25-b5b6-b24989d78be4" (UID: "6d0fe660-4646-4b25-b5b6-b24989d78be4"). InnerVolumeSpecName "kube-api-access-ghrg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.374778 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-scripts" (OuterVolumeSpecName: "scripts") pod "6d0fe660-4646-4b25-b5b6-b24989d78be4" (UID: "6d0fe660-4646-4b25-b5b6-b24989d78be4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.400033 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d0fe660-4646-4b25-b5b6-b24989d78be4" (UID: "6d0fe660-4646-4b25-b5b6-b24989d78be4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.436495 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-config-data" (OuterVolumeSpecName: "config-data") pod "6d0fe660-4646-4b25-b5b6-b24989d78be4" (UID: "6d0fe660-4646-4b25-b5b6-b24989d78be4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.449654 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" path="/var/lib/kubelet/pods/b6931ea0-428f-4f4a-991b-532c8064542f/volumes" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.450375 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" path="/var/lib/kubelet/pods/ce0ba05c-0aea-424d-973a-6a90f9f85683/volumes" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.451067 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" path="/var/lib/kubelet/pods/ffcc2849-7c78-4b8d-a0ec-e629c3556a6b/volumes" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.465768 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghrg5\" (UniqueName: \"kubernetes.io/projected/6d0fe660-4646-4b25-b5b6-b24989d78be4-kube-api-access-ghrg5\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:25 crc 
kubenswrapper[4786]: I0313 12:10:25.465792 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.465802 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.465810 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d0fe660-4646-4b25-b5b6-b24989d78be4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.465818 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.465826 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0fe660-4646-4b25-b5b6-b24989d78be4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.989689 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-59gr5" event={"ID":"6d0fe660-4646-4b25-b5b6-b24989d78be4","Type":"ContainerDied","Data":"bc480304a1fe5865f4caeba93f47411f510feb6f839716604f1aafa45381d28f"} Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.989731 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc480304a1fe5865f4caeba93f47411f510feb6f839716604f1aafa45381d28f" Mar 13 12:10:25 crc kubenswrapper[4786]: I0313 12:10:25.990759 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-59gr5" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.360149 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371582 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerName="dnsmasq-dns" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371616 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerName="dnsmasq-dns" Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371640 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" containerName="barbican-keystone-listener" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371647 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" containerName="barbican-keystone-listener" Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371666 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerName="barbican-worker-log" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371672 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerName="barbican-worker-log" Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371693 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda617df-8768-489a-8bbd-5876eb587961" containerName="dnsmasq-dns" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371698 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda617df-8768-489a-8bbd-5876eb587961" containerName="dnsmasq-dns" Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371714 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda617df-8768-489a-8bbd-5876eb587961" 
containerName="init" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371719 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda617df-8768-489a-8bbd-5876eb587961" containerName="init" Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371735 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerName="init" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371740 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerName="init" Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371755 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" containerName="barbican-keystone-listener-log" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371761 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" containerName="barbican-keystone-listener-log" Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371782 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerName="barbican-worker" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371788 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerName="barbican-worker" Mar 13 12:10:26 crc kubenswrapper[4786]: E0313 12:10:26.371798 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0fe660-4646-4b25-b5b6-b24989d78be4" containerName="cinder-db-sync" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.371805 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0fe660-4646-4b25-b5b6-b24989d78be4" containerName="cinder-db-sync" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.372070 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda617df-8768-489a-8bbd-5876eb587961" containerName="dnsmasq-dns" Mar 
13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.372085 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerName="barbican-worker-log" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.372093 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffcc2849-7c78-4b8d-a0ec-e629c3556a6b" containerName="dnsmasq-dns" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.372104 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" containerName="barbican-keystone-listener-log" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.372114 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0fe660-4646-4b25-b5b6-b24989d78be4" containerName="cinder-db-sync" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.372126 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6931ea0-428f-4f4a-991b-532c8064542f" containerName="barbican-keystone-listener" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.372136 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0ba05c-0aea-424d-973a-6a90f9f85683" containerName="barbican-worker" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.373145 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.381729 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.381974 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.382108 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8pp56" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.383412 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.459969 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.489609 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03a4f906-25da-4780-988b-444065d26080-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.489657 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.489870 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.490046 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.490090 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-scripts\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.490109 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtvxk\" (UniqueName: \"kubernetes.io/projected/03a4f906-25da-4780-988b-444065d26080-kube-api-access-dtvxk\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.493531 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-5628g"] Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.497002 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.500984 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-5628g"] Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591595 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-swift-storage-0\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591668 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591711 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-scripts\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591736 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtvxk\" (UniqueName: \"kubernetes.io/projected/03a4f906-25da-4780-988b-444065d26080-kube-api-access-dtvxk\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591776 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-svc\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591804 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/018af6b4-883b-4532-a65f-58f8b6e00b39-kube-api-access-m8gkl\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591898 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-sb\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591927 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03a4f906-25da-4780-988b-444065d26080-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.591950 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.592008 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-config\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.592061 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-nb\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.592091 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.594143 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03a4f906-25da-4780-988b-444065d26080-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.601320 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.602620 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.602754 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.603191 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.603266 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-scripts\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.606056 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.611129 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.620672 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtvxk\" (UniqueName: \"kubernetes.io/projected/03a4f906-25da-4780-988b-444065d26080-kube-api-access-dtvxk\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.621096 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693226 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-scripts\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693269 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-swift-storage-0\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693299 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693430 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693477 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-svc\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693519 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/018af6b4-883b-4532-a65f-58f8b6e00b39-kube-api-access-m8gkl\") pod 
\"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693615 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693643 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693676 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-sb\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693927 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-config\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.693973 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-logs\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " 
pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.694021 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4wm7\" (UniqueName: \"kubernetes.io/projected/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-kube-api-access-x4wm7\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.694050 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-nb\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.694350 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-swift-storage-0\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.694536 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-svc\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.694688 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-sb\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 
12:10:26.694762 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-nb\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.694898 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-config\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.698772 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.726904 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/018af6b4-883b-4532-a65f-58f8b6e00b39-kube-api-access-m8gkl\") pod \"dnsmasq-dns-5547746bbf-5628g\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.794068 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56fb7846b-ms2zg" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:49822->10.217.0.165:9311: read: connection reset by peer" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.794065 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56fb7846b-ms2zg" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 
10.217.0.2:49806->10.217.0.165:9311: read: connection reset by peer" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.795608 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-scripts\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.795659 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.795713 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.795777 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.795805 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.795898 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-logs\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.795937 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4wm7\" (UniqueName: \"kubernetes.io/projected/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-kube-api-access-x4wm7\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.796132 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.796487 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-logs\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.799941 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-scripts\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.800918 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.803134 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.803678 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.826027 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4wm7\" (UniqueName: \"kubernetes.io/projected/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-kube-api-access-x4wm7\") pod \"cinder-api-0\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " pod="openstack/cinder-api-0" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.826417 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:26 crc kubenswrapper[4786]: I0313 12:10:26.987113 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.002933 4786 generic.go:334] "Generic (PLEG): container finished" podID="21631dfa-11bd-41ad-a325-7c4136be967c" containerID="1c6841d048b400b12515dc52640dc6fd7a71fbeef15df148fc954c73be47422f" exitCode=0 Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.002978 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56fb7846b-ms2zg" event={"ID":"21631dfa-11bd-41ad-a325-7c4136be967c","Type":"ContainerDied","Data":"1c6841d048b400b12515dc52640dc6fd7a71fbeef15df148fc954c73be47422f"} Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.284149 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:27 crc kubenswrapper[4786]: W0313 12:10:27.288913 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03a4f906_25da_4780_988b_444065d26080.slice/crio-3df41ccf0150da0c0b0edb7dcded8403c5b0d8a47af58b1e62c4c4030bd730aa WatchSource:0}: Error finding container 3df41ccf0150da0c0b0edb7dcded8403c5b0d8a47af58b1e62c4c4030bd730aa: Status 404 returned error can't find the container with id 3df41ccf0150da0c0b0edb7dcded8403c5b0d8a47af58b1e62c4c4030bd730aa Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.397696 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-5628g"] Mar 13 12:10:27 crc kubenswrapper[4786]: W0313 12:10:27.401102 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod018af6b4_883b_4532_a65f_58f8b6e00b39.slice/crio-d488ede029ccab7ae0016686e8917bf4d74c3909fcf86e27293438eb9bb3b67a WatchSource:0}: Error finding container d488ede029ccab7ae0016686e8917bf4d74c3909fcf86e27293438eb9bb3b67a: Status 404 returned error can't find the container with id 
d488ede029ccab7ae0016686e8917bf4d74c3909fcf86e27293438eb9bb3b67a Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.420115 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56fb7846b-ms2zg" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.514206 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data-custom\") pod \"21631dfa-11bd-41ad-a325-7c4136be967c\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.514252 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bstq9\" (UniqueName: \"kubernetes.io/projected/21631dfa-11bd-41ad-a325-7c4136be967c-kube-api-access-bstq9\") pod \"21631dfa-11bd-41ad-a325-7c4136be967c\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.514439 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data\") pod \"21631dfa-11bd-41ad-a325-7c4136be967c\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.514503 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-combined-ca-bundle\") pod \"21631dfa-11bd-41ad-a325-7c4136be967c\" (UID: \"21631dfa-11bd-41ad-a325-7c4136be967c\") " Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.514544 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21631dfa-11bd-41ad-a325-7c4136be967c-logs\") pod \"21631dfa-11bd-41ad-a325-7c4136be967c\" (UID: 
\"21631dfa-11bd-41ad-a325-7c4136be967c\") " Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.515506 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21631dfa-11bd-41ad-a325-7c4136be967c-logs" (OuterVolumeSpecName: "logs") pod "21631dfa-11bd-41ad-a325-7c4136be967c" (UID: "21631dfa-11bd-41ad-a325-7c4136be967c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.518166 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21631dfa-11bd-41ad-a325-7c4136be967c-kube-api-access-bstq9" (OuterVolumeSpecName: "kube-api-access-bstq9") pod "21631dfa-11bd-41ad-a325-7c4136be967c" (UID: "21631dfa-11bd-41ad-a325-7c4136be967c"). InnerVolumeSpecName "kube-api-access-bstq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.518289 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "21631dfa-11bd-41ad-a325-7c4136be967c" (UID: "21631dfa-11bd-41ad-a325-7c4136be967c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.546951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21631dfa-11bd-41ad-a325-7c4136be967c" (UID: "21631dfa-11bd-41ad-a325-7c4136be967c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.572103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data" (OuterVolumeSpecName: "config-data") pod "21631dfa-11bd-41ad-a325-7c4136be967c" (UID: "21631dfa-11bd-41ad-a325-7c4136be967c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.616061 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.616089 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.616100 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21631dfa-11bd-41ad-a325-7c4136be967c-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.616108 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21631dfa-11bd-41ad-a325-7c4136be967c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.616116 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bstq9\" (UniqueName: \"kubernetes.io/projected/21631dfa-11bd-41ad-a325-7c4136be967c-kube-api-access-bstq9\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:27 crc kubenswrapper[4786]: W0313 12:10:27.634517 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b40b7ee_291d_4e92_bf9e_6c7167fc82b4.slice/crio-9a94861e22e1c162f3f2a9813bf18c2aab8bd5e2ee1a7cdfa08d1d003048307e WatchSource:0}: Error finding container 9a94861e22e1c162f3f2a9813bf18c2aab8bd5e2ee1a7cdfa08d1d003048307e: Status 404 returned error can't find the container with id 9a94861e22e1c162f3f2a9813bf18c2aab8bd5e2ee1a7cdfa08d1d003048307e Mar 13 12:10:27 crc kubenswrapper[4786]: I0313 12:10:27.640197 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.016046 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56fb7846b-ms2zg" Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.016035 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56fb7846b-ms2zg" event={"ID":"21631dfa-11bd-41ad-a325-7c4136be967c","Type":"ContainerDied","Data":"36ccdf4d3e91c690af3c00b001b3aa77d535f66ff901e8e4ce98e41d376668c3"} Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.016130 4786 scope.go:117] "RemoveContainer" containerID="1c6841d048b400b12515dc52640dc6fd7a71fbeef15df148fc954c73be47422f" Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.017940 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03a4f906-25da-4780-988b-444065d26080","Type":"ContainerStarted","Data":"3df41ccf0150da0c0b0edb7dcded8403c5b0d8a47af58b1e62c4c4030bd730aa"} Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.023541 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4","Type":"ContainerStarted","Data":"9a94861e22e1c162f3f2a9813bf18c2aab8bd5e2ee1a7cdfa08d1d003048307e"} Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.025147 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="018af6b4-883b-4532-a65f-58f8b6e00b39" containerID="19c9ff3e830a432f46800afce68efbb91d132b1e93d7254352532633446da752" exitCode=0 Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.025200 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-5628g" event={"ID":"018af6b4-883b-4532-a65f-58f8b6e00b39","Type":"ContainerDied","Data":"19c9ff3e830a432f46800afce68efbb91d132b1e93d7254352532633446da752"} Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.025242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-5628g" event={"ID":"018af6b4-883b-4532-a65f-58f8b6e00b39","Type":"ContainerStarted","Data":"d488ede029ccab7ae0016686e8917bf4d74c3909fcf86e27293438eb9bb3b67a"} Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.068430 4786 scope.go:117] "RemoveContainer" containerID="873478a5556be3bed25bddaf8ad90b31f8b27f29235630d364b4dd778164e138" Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.078821 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56fb7846b-ms2zg"] Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.088002 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56fb7846b-ms2zg"] Mar 13 12:10:28 crc kubenswrapper[4786]: I0313 12:10:28.810916 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.039391 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-5628g" event={"ID":"018af6b4-883b-4532-a65f-58f8b6e00b39","Type":"ContainerStarted","Data":"2e436c466c2593e6599542d39b946854245f5133ce60a7d4917eab893b67f827"} Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.041034 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.049672 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4","Type":"ContainerStarted","Data":"d6c821a13cfd70e9ea5db3d268c4853e52894db491eb8d02e4ab2e18ad839755"} Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.067232 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5547746bbf-5628g" podStartSLOduration=3.067214427 podStartE2EDuration="3.067214427s" podCreationTimestamp="2026-03-13 12:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:29.066463876 +0000 UTC m=+1416.346117323" watchObservedRunningTime="2026-03-13 12:10:29.067214427 +0000 UTC m=+1416.346867874" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.456665 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" path="/var/lib/kubelet/pods/21631dfa-11bd-41ad-a325-7c4136be967c/volumes" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.569937 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.758423 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-config-data\") pod \"da5e2292-690b-4774-833c-5823cfb8f6ca\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.758510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-sg-core-conf-yaml\") pod \"da5e2292-690b-4774-833c-5823cfb8f6ca\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.758665 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2jw8\" (UniqueName: \"kubernetes.io/projected/da5e2292-690b-4774-833c-5823cfb8f6ca-kube-api-access-p2jw8\") pod \"da5e2292-690b-4774-833c-5823cfb8f6ca\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.759379 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-scripts\") pod \"da5e2292-690b-4774-833c-5823cfb8f6ca\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.759428 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-log-httpd\") pod \"da5e2292-690b-4774-833c-5823cfb8f6ca\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.759565 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-run-httpd\") pod \"da5e2292-690b-4774-833c-5823cfb8f6ca\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.759600 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-combined-ca-bundle\") pod \"da5e2292-690b-4774-833c-5823cfb8f6ca\" (UID: \"da5e2292-690b-4774-833c-5823cfb8f6ca\") " Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.760047 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da5e2292-690b-4774-833c-5823cfb8f6ca" (UID: "da5e2292-690b-4774-833c-5823cfb8f6ca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.760140 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da5e2292-690b-4774-833c-5823cfb8f6ca" (UID: "da5e2292-690b-4774-833c-5823cfb8f6ca"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.760306 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.760320 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5e2292-690b-4774-833c-5823cfb8f6ca-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.762294 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5e2292-690b-4774-833c-5823cfb8f6ca-kube-api-access-p2jw8" (OuterVolumeSpecName: "kube-api-access-p2jw8") pod "da5e2292-690b-4774-833c-5823cfb8f6ca" (UID: "da5e2292-690b-4774-833c-5823cfb8f6ca"). InnerVolumeSpecName "kube-api-access-p2jw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.768042 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-scripts" (OuterVolumeSpecName: "scripts") pod "da5e2292-690b-4774-833c-5823cfb8f6ca" (UID: "da5e2292-690b-4774-833c-5823cfb8f6ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.786590 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da5e2292-690b-4774-833c-5823cfb8f6ca" (UID: "da5e2292-690b-4774-833c-5823cfb8f6ca"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.832087 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da5e2292-690b-4774-833c-5823cfb8f6ca" (UID: "da5e2292-690b-4774-833c-5823cfb8f6ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.861484 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.861515 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2jw8\" (UniqueName: \"kubernetes.io/projected/da5e2292-690b-4774-833c-5823cfb8f6ca-kube-api-access-p2jw8\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.861526 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.861534 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.877324 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-config-data" (OuterVolumeSpecName: "config-data") pod "da5e2292-690b-4774-833c-5823cfb8f6ca" (UID: "da5e2292-690b-4774-833c-5823cfb8f6ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:29 crc kubenswrapper[4786]: I0313 12:10:29.962060 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5e2292-690b-4774-833c-5823cfb8f6ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.059275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4","Type":"ContainerStarted","Data":"72b3285ba544711e76b3e46aa28b9b2795ea7c731a6135be3ef16621d7c38908"} Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.059443 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerName="cinder-api-log" containerID="cri-o://d6c821a13cfd70e9ea5db3d268c4853e52894db491eb8d02e4ab2e18ad839755" gracePeriod=30 Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.059511 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerName="cinder-api" containerID="cri-o://72b3285ba544711e76b3e46aa28b9b2795ea7c731a6135be3ef16621d7c38908" gracePeriod=30 Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.059557 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.062182 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03a4f906-25da-4780-988b-444065d26080","Type":"ContainerStarted","Data":"f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c"} Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.071470 4786 generic.go:334] "Generic (PLEG): container finished" podID="da5e2292-690b-4774-833c-5823cfb8f6ca" 
containerID="ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836" exitCode=0 Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.071534 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.071537 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerDied","Data":"ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836"} Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.071987 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5e2292-690b-4774-833c-5823cfb8f6ca","Type":"ContainerDied","Data":"b6145fcf8363eb1b8be26b0e9be40f06c0ecdde5f2973f5a3d5a25ea5b9501b5"} Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.072013 4786 scope.go:117] "RemoveContainer" containerID="2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.084605 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.084581792 podStartE2EDuration="4.084581792s" podCreationTimestamp="2026-03-13 12:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:30.077247663 +0000 UTC m=+1417.356901130" watchObservedRunningTime="2026-03-13 12:10:30.084581792 +0000 UTC m=+1417.364235249" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.097625 4786 scope.go:117] "RemoveContainer" containerID="df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.137079 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.147048 4786 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.147090 4786 scope.go:117] "RemoveContainer" containerID="ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.168896 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.169286 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="sg-core" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169297 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="sg-core" Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.169319 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="ceilometer-notification-agent" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169325 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="ceilometer-notification-agent" Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.169335 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169341 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api" Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.169351 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api-log" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169357 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api-log" 
Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.169370 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="proxy-httpd" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169376 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="proxy-httpd" Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.169389 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="ceilometer-central-agent" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169395 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="ceilometer-central-agent" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169549 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="sg-core" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169569 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="ceilometer-notification-agent" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169580 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" containerName="proxy-httpd" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169589 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169605 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="21631dfa-11bd-41ad-a325-7c4136be967c" containerName="barbican-api-log" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.169613 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" 
containerName="ceilometer-central-agent" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.172047 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.177705 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.178301 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.198913 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.232075 4786 scope.go:117] "RemoveContainer" containerID="187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.251834 4786 scope.go:117] "RemoveContainer" containerID="2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e" Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.252527 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e\": container with ID starting with 2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e not found: ID does not exist" containerID="2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.252591 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e"} err="failed to get container status \"2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e\": rpc error: code = NotFound desc = could not find container \"2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e\": container with ID 
starting with 2b7b536ea518c02dfe9504376bada9a0bca2b9bddd0a461d137232dcdd63881e not found: ID does not exist" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.252615 4786 scope.go:117] "RemoveContainer" containerID="df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c" Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.253358 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c\": container with ID starting with df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c not found: ID does not exist" containerID="df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.253380 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c"} err="failed to get container status \"df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c\": rpc error: code = NotFound desc = could not find container \"df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c\": container with ID starting with df5069448cd44b82a1d3d80e32474b4c24d623c045c1d4fce24e99d456e2441c not found: ID does not exist" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.253394 4786 scope.go:117] "RemoveContainer" containerID="ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836" Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.253761 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836\": container with ID starting with ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836 not found: ID does not exist" containerID="ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836" Mar 13 
12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.253806 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836"} err="failed to get container status \"ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836\": rpc error: code = NotFound desc = could not find container \"ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836\": container with ID starting with ca47278d1ae0d54344cd2a82f90fe721376f250c76830631a36ba774e7f1f836 not found: ID does not exist" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.253834 4786 scope.go:117] "RemoveContainer" containerID="187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0" Mar 13 12:10:30 crc kubenswrapper[4786]: E0313 12:10:30.254174 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0\": container with ID starting with 187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0 not found: ID does not exist" containerID="187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.254204 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0"} err="failed to get container status \"187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0\": rpc error: code = NotFound desc = could not find container \"187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0\": container with ID starting with 187447ce3ea2b4f203373161cc811cdf77452e3df27e4088bbe9ae967adb83a0 not found: ID does not exist" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.374522 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-log-httpd\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.374584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29vj2\" (UniqueName: \"kubernetes.io/projected/b0b43d76-6530-436e-9c8f-a3de4193f997-kube-api-access-29vj2\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.374639 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.374687 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-scripts\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.374702 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.374730 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-config-data\") pod \"ceilometer-0\" (UID: 
\"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.374757 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-run-httpd\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.477682 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-log-httpd\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.477737 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29vj2\" (UniqueName: \"kubernetes.io/projected/b0b43d76-6530-436e-9c8f-a3de4193f997-kube-api-access-29vj2\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.477814 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.477957 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.477982 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-scripts\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.478023 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-config-data\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.478054 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-run-httpd\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.478401 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-log-httpd\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.478546 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-run-httpd\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.483221 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc 
kubenswrapper[4786]: I0313 12:10:30.484110 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.486096 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-config-data\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.491431 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-scripts\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.496666 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29vj2\" (UniqueName: \"kubernetes.io/projected/b0b43d76-6530-436e-9c8f-a3de4193f997-kube-api-access-29vj2\") pod \"ceilometer-0\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " pod="openstack/ceilometer-0" Mar 13 12:10:30 crc kubenswrapper[4786]: I0313 12:10:30.510330 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:10:31 crc kubenswrapper[4786]: W0313 12:10:31.020822 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0b43d76_6530_436e_9c8f_a3de4193f997.slice/crio-245bd61bd0c189010f4b671cc24741f1c33d6871f16ded771881cf02ac482201 WatchSource:0}: Error finding container 245bd61bd0c189010f4b671cc24741f1c33d6871f16ded771881cf02ac482201: Status 404 returned error can't find the container with id 245bd61bd0c189010f4b671cc24741f1c33d6871f16ded771881cf02ac482201 Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.021403 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.084652 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerStarted","Data":"245bd61bd0c189010f4b671cc24741f1c33d6871f16ded771881cf02ac482201"} Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.089997 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03a4f906-25da-4780-988b-444065d26080","Type":"ContainerStarted","Data":"078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade"} Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.095278 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerID="72b3285ba544711e76b3e46aa28b9b2795ea7c731a6135be3ef16621d7c38908" exitCode=0 Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.095308 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerID="d6c821a13cfd70e9ea5db3d268c4853e52894db491eb8d02e4ab2e18ad839755" exitCode=143 Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.095426 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4","Type":"ContainerDied","Data":"72b3285ba544711e76b3e46aa28b9b2795ea7c731a6135be3ef16621d7c38908"} Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.095505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4","Type":"ContainerDied","Data":"d6c821a13cfd70e9ea5db3d268c4853e52894db491eb8d02e4ab2e18ad839755"} Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.113862 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.69133876 podStartE2EDuration="5.113824658s" podCreationTimestamp="2026-03-13 12:10:26 +0000 UTC" firstStartedPulling="2026-03-13 12:10:27.291181896 +0000 UTC m=+1414.570835343" lastFinishedPulling="2026-03-13 12:10:28.713667794 +0000 UTC m=+1415.993321241" observedRunningTime="2026-03-13 12:10:31.111604558 +0000 UTC m=+1418.391258055" watchObservedRunningTime="2026-03-13 12:10:31.113824658 +0000 UTC m=+1418.393478175" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.180851 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.293687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4wm7\" (UniqueName: \"kubernetes.io/projected/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-kube-api-access-x4wm7\") pod \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.297238 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-logs\") pod \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.297362 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data\") pod \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.297386 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-etc-machine-id\") pod \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.297518 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-scripts\") pod \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.297563 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data-custom\") pod \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.297612 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-combined-ca-bundle\") pod \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\" (UID: \"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4\") " Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.297676 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-logs" (OuterVolumeSpecName: "logs") pod "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" (UID: "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.297733 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" (UID: "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.300544 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.300564 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.313130 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" (UID: "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.314191 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-scripts" (OuterVolumeSpecName: "scripts") pod "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" (UID: "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.324420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-kube-api-access-x4wm7" (OuterVolumeSpecName: "kube-api-access-x4wm7") pod "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" (UID: "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4"). InnerVolumeSpecName "kube-api-access-x4wm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.338763 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" (UID: "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.358058 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data" (OuterVolumeSpecName: "config-data") pod "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" (UID: "6b40b7ee-291d-4e92-bf9e-6c7167fc82b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.402053 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.402095 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.402111 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.402123 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4wm7\" (UniqueName: \"kubernetes.io/projected/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-kube-api-access-x4wm7\") on node \"crc\" 
DevicePath \"\"" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.402135 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.452597 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5e2292-690b-4774-833c-5823cfb8f6ca" path="/var/lib/kubelet/pods/da5e2292-690b-4774-833c-5823cfb8f6ca/volumes" Mar 13 12:10:31 crc kubenswrapper[4786]: I0313 12:10:31.698965 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.124949 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b40b7ee-291d-4e92-bf9e-6c7167fc82b4","Type":"ContainerDied","Data":"9a94861e22e1c162f3f2a9813bf18c2aab8bd5e2ee1a7cdfa08d1d003048307e"} Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.124967 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.125316 4786 scope.go:117] "RemoveContainer" containerID="72b3285ba544711e76b3e46aa28b9b2795ea7c731a6135be3ef16621d7c38908" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.130272 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerStarted","Data":"4ca295232845e30ee7782cbbc8f944cef3023c0f8b4f8fed9aa8cbb56a2c0002"} Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.152232 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.163074 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.164205 4786 scope.go:117] "RemoveContainer" containerID="d6c821a13cfd70e9ea5db3d268c4853e52894db491eb8d02e4ab2e18ad839755" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.178537 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:32 crc kubenswrapper[4786]: E0313 12:10:32.179059 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerName="cinder-api-log" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.179087 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerName="cinder-api-log" Mar 13 12:10:32 crc kubenswrapper[4786]: E0313 12:10:32.179140 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerName="cinder-api" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.179153 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerName="cinder-api" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.179424 
4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerName="cinder-api" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.179461 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" containerName="cinder-api-log" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.182049 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.187339 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.188352 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.188666 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.212509 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.219383 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.219449 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa75843b-0c7d-49c1-be09-bef85ec8fd16-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.219687 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-scripts\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.219973 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.220089 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data-custom\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.220128 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa75843b-0c7d-49c1-be09-bef85ec8fd16-logs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.220245 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.220308 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxxn\" (UniqueName: 
\"kubernetes.io/projected/aa75843b-0c7d-49c1-be09-bef85ec8fd16-kube-api-access-wrxxn\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.220351 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-public-tls-certs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.321488 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.321536 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa75843b-0c7d-49c1-be09-bef85ec8fd16-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.321572 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-scripts\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.321612 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc 
kubenswrapper[4786]: I0313 12:10:32.321646 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data-custom\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.321662 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa75843b-0c7d-49c1-be09-bef85ec8fd16-logs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.321696 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.321712 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxxn\" (UniqueName: \"kubernetes.io/projected/aa75843b-0c7d-49c1-be09-bef85ec8fd16-kube-api-access-wrxxn\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.321729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-public-tls-certs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.322426 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/aa75843b-0c7d-49c1-be09-bef85ec8fd16-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.325760 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa75843b-0c7d-49c1-be09-bef85ec8fd16-logs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.326230 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-public-tls-certs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.326586 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-scripts\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.326721 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.326788 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data-custom\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.327423 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.340482 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.347756 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxxn\" (UniqueName: \"kubernetes.io/projected/aa75843b-0c7d-49c1-be09-bef85ec8fd16-kube-api-access-wrxxn\") pod \"cinder-api-0\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " pod="openstack/cinder-api-0" Mar 13 12:10:32 crc kubenswrapper[4786]: I0313 12:10:32.516183 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:10:33 crc kubenswrapper[4786]: I0313 12:10:33.055050 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:10:33 crc kubenswrapper[4786]: I0313 12:10:33.181478 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerStarted","Data":"c503626f2b146a24b1400a3045566bf0eb8a620ea09bc42fc74d42629127a420"} Mar 13 12:10:33 crc kubenswrapper[4786]: I0313 12:10:33.184561 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aa75843b-0c7d-49c1-be09-bef85ec8fd16","Type":"ContainerStarted","Data":"760611395e73b3b83a10b25c81ff8ebd3281524b0deaab10939f310d0fc47f02"} Mar 13 12:10:33 crc kubenswrapper[4786]: I0313 12:10:33.457286 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b40b7ee-291d-4e92-bf9e-6c7167fc82b4" path="/var/lib/kubelet/pods/6b40b7ee-291d-4e92-bf9e-6c7167fc82b4/volumes" Mar 13 12:10:34 crc kubenswrapper[4786]: I0313 12:10:34.211924 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aa75843b-0c7d-49c1-be09-bef85ec8fd16","Type":"ContainerStarted","Data":"9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410"} Mar 13 12:10:34 crc kubenswrapper[4786]: I0313 12:10:34.217613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerStarted","Data":"775ce4a56412c18bd9b4661ea4c62fb2c2c8e96fc8399268d2b2a0646a581b94"} Mar 13 12:10:34 crc kubenswrapper[4786]: I0313 12:10:34.902389 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.144167 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cf57c7fc-2x5rg"] Mar 13 12:10:35 
crc kubenswrapper[4786]: I0313 12:10:35.144404 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cf57c7fc-2x5rg" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-api" containerID="cri-o://8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720" gracePeriod=30 Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.144530 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cf57c7fc-2x5rg" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-httpd" containerID="cri-o://2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd" gracePeriod=30 Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.157821 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5cf57c7fc-2x5rg" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": read tcp 10.217.0.2:39040->10.217.0.157:9696: read: connection reset by peer" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.165385 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6577bdf497-p2bmr"] Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.167020 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.182870 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqdqz\" (UniqueName: \"kubernetes.io/projected/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-kube-api-access-tqdqz\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.182962 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-httpd-config\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.182994 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-public-tls-certs\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.183014 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-ovndb-tls-certs\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.183055 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-combined-ca-bundle\") pod \"neutron-6577bdf497-p2bmr\" (UID: 
\"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.183099 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-internal-tls-certs\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.183117 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-config\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.188791 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6577bdf497-p2bmr"] Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.228005 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerStarted","Data":"4b2785eb233bfc907642218cd7bf45b2c34261572582f57a5ec0d4f4cd2ab067"} Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.229183 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.230723 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aa75843b-0c7d-49c1-be09-bef85ec8fd16","Type":"ContainerStarted","Data":"be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a"} Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.231528 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 
12:10:35.288378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqdqz\" (UniqueName: \"kubernetes.io/projected/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-kube-api-access-tqdqz\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.288491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-httpd-config\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.288545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-public-tls-certs\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.288578 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-ovndb-tls-certs\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.288668 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-combined-ca-bundle\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.288754 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-internal-tls-certs\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.288777 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-config\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.302786 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-combined-ca-bundle\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.305426 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-internal-tls-certs\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.306041 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-httpd-config\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.306650 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-ovndb-tls-certs\") pod 
\"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.307448 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-public-tls-certs\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.311565 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-config\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.327787 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqdqz\" (UniqueName: \"kubernetes.io/projected/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-kube-api-access-tqdqz\") pod \"neutron-6577bdf497-p2bmr\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.340747 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.543801605 podStartE2EDuration="5.340724429s" podCreationTimestamp="2026-03-13 12:10:30 +0000 UTC" firstStartedPulling="2026-03-13 12:10:31.02323293 +0000 UTC m=+1418.302886387" lastFinishedPulling="2026-03-13 12:10:34.820155754 +0000 UTC m=+1422.099809211" observedRunningTime="2026-03-13 12:10:35.292954683 +0000 UTC m=+1422.572608140" watchObservedRunningTime="2026-03-13 12:10:35.340724429 +0000 UTC m=+1422.620377876" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.342016 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=3.342007944 podStartE2EDuration="3.342007944s" podCreationTimestamp="2026-03-13 12:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:35.32233679 +0000 UTC m=+1422.601990237" watchObservedRunningTime="2026-03-13 12:10:35.342007944 +0000 UTC m=+1422.621661391" Mar 13 12:10:35 crc kubenswrapper[4786]: I0313 12:10:35.544139 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:36 crc kubenswrapper[4786]: I0313 12:10:36.090155 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6577bdf497-p2bmr"] Mar 13 12:10:36 crc kubenswrapper[4786]: W0313 12:10:36.110171 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0813a8e1_e94c_43ed_a0d9_fd3fdcb6660c.slice/crio-776ca0d624d9affa9d3bae1a7df7572c7458baa1c7d6fdfaffccc4d21f325dde WatchSource:0}: Error finding container 776ca0d624d9affa9d3bae1a7df7572c7458baa1c7d6fdfaffccc4d21f325dde: Status 404 returned error can't find the container with id 776ca0d624d9affa9d3bae1a7df7572c7458baa1c7d6fdfaffccc4d21f325dde Mar 13 12:10:36 crc kubenswrapper[4786]: I0313 12:10:36.248582 4786 generic.go:334] "Generic (PLEG): container finished" podID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerID="2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd" exitCode=0 Mar 13 12:10:36 crc kubenswrapper[4786]: I0313 12:10:36.248789 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf57c7fc-2x5rg" event={"ID":"1664e190-3182-45ed-8365-e4ac4dccf4cd","Type":"ContainerDied","Data":"2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd"} Mar 13 12:10:36 crc kubenswrapper[4786]: I0313 12:10:36.250169 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6577bdf497-p2bmr" event={"ID":"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c","Type":"ContainerStarted","Data":"776ca0d624d9affa9d3bae1a7df7572c7458baa1c7d6fdfaffccc4d21f325dde"} Mar 13 12:10:36 crc kubenswrapper[4786]: I0313 12:10:36.828003 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:10:36 crc kubenswrapper[4786]: I0313 12:10:36.905335 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-7dk2d"] Mar 13 12:10:36 crc kubenswrapper[4786]: I0313 12:10:36.905617 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" podUID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" containerName="dnsmasq-dns" containerID="cri-o://c521db66dc54a5126f8b6cb3105faf44beb7753db75d2421f365e41e76b99123" gracePeriod=10 Mar 13 12:10:36 crc kubenswrapper[4786]: I0313 12:10:36.968254 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.020486 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.271223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6577bdf497-p2bmr" event={"ID":"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c","Type":"ContainerStarted","Data":"617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd"} Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.271641 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6577bdf497-p2bmr" event={"ID":"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c","Type":"ContainerStarted","Data":"1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555"} Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.271969 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.282721 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" containerID="c521db66dc54a5126f8b6cb3105faf44beb7753db75d2421f365e41e76b99123" exitCode=0 Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.282964 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" event={"ID":"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10","Type":"ContainerDied","Data":"c521db66dc54a5126f8b6cb3105faf44beb7753db75d2421f365e41e76b99123"} Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.283114 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="03a4f906-25da-4780-988b-444065d26080" containerName="cinder-scheduler" containerID="cri-o://f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c" gracePeriod=30 Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.283232 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="03a4f906-25da-4780-988b-444065d26080" containerName="probe" containerID="cri-o://078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade" gracePeriod=30 Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.300042 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6577bdf497-p2bmr" podStartSLOduration=2.300016732 podStartE2EDuration="2.300016732s" podCreationTimestamp="2026-03-13 12:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:37.292536189 +0000 UTC m=+1424.572189656" watchObservedRunningTime="2026-03-13 12:10:37.300016732 +0000 UTC m=+1424.579670179" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.514774 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.559568 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5cf57c7fc-2x5rg" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.636519 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-swift-storage-0\") pod \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.636575 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-sb\") pod \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.636654 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-svc\") pod \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.636740 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-nb\") pod \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.636773 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-config\") pod \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.636803 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmtqj\" (UniqueName: \"kubernetes.io/projected/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-kube-api-access-gmtqj\") pod \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\" (UID: \"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10\") " Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.645680 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-kube-api-access-gmtqj" (OuterVolumeSpecName: "kube-api-access-gmtqj") pod "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" (UID: "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10"). InnerVolumeSpecName "kube-api-access-gmtqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.702523 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" (UID: "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.708112 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" (UID: "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.714563 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" (UID: "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.722569 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-config" (OuterVolumeSpecName: "config") pod "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" (UID: "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.724388 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" (UID: "2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.739113 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.739156 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.739170 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.739182 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.739250 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:37 crc kubenswrapper[4786]: I0313 12:10:37.739265 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmtqj\" (UniqueName: \"kubernetes.io/projected/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10-kube-api-access-gmtqj\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.169520 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.169982 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.226133 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cf57c7fc-2x5rg" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.292665 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" event={"ID":"2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10","Type":"ContainerDied","Data":"9c4ea8af88916bd9e0cec388cb8ef7f762338ed64db29b86f1f3984addebfed3"} Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.292686 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77cf8fb985-7dk2d" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.292720 4786 scope.go:117] "RemoveContainer" containerID="c521db66dc54a5126f8b6cb3105faf44beb7753db75d2421f365e41e76b99123" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.295683 4786 generic.go:334] "Generic (PLEG): container finished" podID="03a4f906-25da-4780-988b-444065d26080" containerID="078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade" exitCode=0 Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.295770 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03a4f906-25da-4780-988b-444065d26080","Type":"ContainerDied","Data":"078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade"} Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.301567 4786 generic.go:334] "Generic (PLEG): container finished" podID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerID="8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720" exitCode=0 Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.301646 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf57c7fc-2x5rg" event={"ID":"1664e190-3182-45ed-8365-e4ac4dccf4cd","Type":"ContainerDied","Data":"8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720"} Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.301703 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf57c7fc-2x5rg" event={"ID":"1664e190-3182-45ed-8365-e4ac4dccf4cd","Type":"ContainerDied","Data":"bd75ec3fafeef3f51b860acac24b5b8c3973616b2a298d80c095605f9e9891f5"} Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.301711 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cf57c7fc-2x5rg" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.344158 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-7dk2d"] Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.348165 4786 scope.go:117] "RemoveContainer" containerID="b058fdc20733428aa91371a37717688b5c9eb0c8f0b8b50f9e4ee0db1ba1c047" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.351151 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-config\") pod \"1664e190-3182-45ed-8365-e4ac4dccf4cd\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.351231 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-httpd-config\") pod \"1664e190-3182-45ed-8365-e4ac4dccf4cd\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.351257 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-internal-tls-certs\") pod \"1664e190-3182-45ed-8365-e4ac4dccf4cd\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.351291 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-combined-ca-bundle\") pod \"1664e190-3182-45ed-8365-e4ac4dccf4cd\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.351342 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-public-tls-certs\") pod \"1664e190-3182-45ed-8365-e4ac4dccf4cd\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.351454 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-ovndb-tls-certs\") pod \"1664e190-3182-45ed-8365-e4ac4dccf4cd\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.351517 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btzwz\" (UniqueName: \"kubernetes.io/projected/1664e190-3182-45ed-8365-e4ac4dccf4cd-kube-api-access-btzwz\") pod \"1664e190-3182-45ed-8365-e4ac4dccf4cd\" (UID: \"1664e190-3182-45ed-8365-e4ac4dccf4cd\") " Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.355406 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77cf8fb985-7dk2d"] Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.357833 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1664e190-3182-45ed-8365-e4ac4dccf4cd-kube-api-access-btzwz" (OuterVolumeSpecName: "kube-api-access-btzwz") pod "1664e190-3182-45ed-8365-e4ac4dccf4cd" (UID: "1664e190-3182-45ed-8365-e4ac4dccf4cd"). InnerVolumeSpecName "kube-api-access-btzwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.362013 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1664e190-3182-45ed-8365-e4ac4dccf4cd" (UID: "1664e190-3182-45ed-8365-e4ac4dccf4cd"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.385240 4786 scope.go:117] "RemoveContainer" containerID="2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.424213 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1664e190-3182-45ed-8365-e4ac4dccf4cd" (UID: "1664e190-3182-45ed-8365-e4ac4dccf4cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.427560 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1664e190-3182-45ed-8365-e4ac4dccf4cd" (UID: "1664e190-3182-45ed-8365-e4ac4dccf4cd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.427808 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-config" (OuterVolumeSpecName: "config") pod "1664e190-3182-45ed-8365-e4ac4dccf4cd" (UID: "1664e190-3182-45ed-8365-e4ac4dccf4cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.447061 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1664e190-3182-45ed-8365-e4ac4dccf4cd" (UID: "1664e190-3182-45ed-8365-e4ac4dccf4cd"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.453937 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btzwz\" (UniqueName: \"kubernetes.io/projected/1664e190-3182-45ed-8365-e4ac4dccf4cd-kube-api-access-btzwz\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.453963 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.453972 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.453982 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.453990 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.453998 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.459176 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1664e190-3182-45ed-8365-e4ac4dccf4cd" (UID: 
"1664e190-3182-45ed-8365-e4ac4dccf4cd"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.495432 4786 scope.go:117] "RemoveContainer" containerID="8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.517514 4786 scope.go:117] "RemoveContainer" containerID="2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd" Mar 13 12:10:38 crc kubenswrapper[4786]: E0313 12:10:38.517902 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd\": container with ID starting with 2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd not found: ID does not exist" containerID="2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.518156 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd"} err="failed to get container status \"2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd\": rpc error: code = NotFound desc = could not find container \"2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd\": container with ID starting with 2e8750713ee522f126b1595c39a0d3b9994e5bf6a1f2f8aa7b5ca32ff6afdbcd not found: ID does not exist" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.518192 4786 scope.go:117] "RemoveContainer" containerID="8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720" Mar 13 12:10:38 crc kubenswrapper[4786]: E0313 12:10:38.518681 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720\": container 
with ID starting with 8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720 not found: ID does not exist" containerID="8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.518760 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720"} err="failed to get container status \"8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720\": rpc error: code = NotFound desc = could not find container \"8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720\": container with ID starting with 8eeffaf09d32425d7da6f0bed0e06a0b39055102e909e4bc65ab11411b3d1720 not found: ID does not exist" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.555492 4786 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1664e190-3182-45ed-8365-e4ac4dccf4cd-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.643140 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cf57c7fc-2x5rg"] Mar 13 12:10:38 crc kubenswrapper[4786]: I0313 12:10:38.651434 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5cf57c7fc-2x5rg"] Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.005494 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7d4596df6b-5xl5b" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.020988 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7d4596df6b-5xl5b" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.250234 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-88c9bb894-zzsvv"] Mar 13 12:10:39 crc kubenswrapper[4786]: E0313 12:10:39.250596 4786 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" containerName="dnsmasq-dns" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.250612 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" containerName="dnsmasq-dns" Mar 13 12:10:39 crc kubenswrapper[4786]: E0313 12:10:39.250622 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-api" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.250628 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-api" Mar 13 12:10:39 crc kubenswrapper[4786]: E0313 12:10:39.250639 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-httpd" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.250646 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-httpd" Mar 13 12:10:39 crc kubenswrapper[4786]: E0313 12:10:39.250656 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" containerName="init" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.250661 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" containerName="init" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.250868 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" containerName="dnsmasq-dns" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.250902 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-httpd" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.250923 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" containerName="neutron-api" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.251810 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.274504 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-88c9bb894-zzsvv"] Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.379463 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-public-tls-certs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.379551 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-config-data\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.379588 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-logs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.379698 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-scripts\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc 
kubenswrapper[4786]: I0313 12:10:39.379735 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcjz4\" (UniqueName: \"kubernetes.io/projected/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-kube-api-access-jcjz4\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.379953 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-internal-tls-certs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.380065 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-combined-ca-bundle\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.450625 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1664e190-3182-45ed-8365-e4ac4dccf4cd" path="/var/lib/kubelet/pods/1664e190-3182-45ed-8365-e4ac4dccf4cd/volumes" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.451467 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10" path="/var/lib/kubelet/pods/2ff7e13b-46a0-44e6-9a4d-ba2aac3e5e10/volumes" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.482406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-config-data\") pod \"placement-88c9bb894-zzsvv\" (UID: 
\"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.482474 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-logs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.482552 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-scripts\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.482593 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjz4\" (UniqueName: \"kubernetes.io/projected/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-kube-api-access-jcjz4\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.482683 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-internal-tls-certs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.482739 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-combined-ca-bundle\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 
12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.482785 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-public-tls-certs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.482936 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-logs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.486316 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-scripts\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.486631 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-public-tls-certs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.486658 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-internal-tls-certs\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.487089 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-combined-ca-bundle\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.488717 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-config-data\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.506446 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjz4\" (UniqueName: \"kubernetes.io/projected/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-kube-api-access-jcjz4\") pod \"placement-88c9bb894-zzsvv\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") " pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:39 crc kubenswrapper[4786]: I0313 12:10:39.573427 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:40 crc kubenswrapper[4786]: I0313 12:10:40.083787 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-88c9bb894-zzsvv"] Mar 13 12:10:40 crc kubenswrapper[4786]: I0313 12:10:40.354316 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88c9bb894-zzsvv" event={"ID":"b0d491ad-ee68-47bb-a1e3-66d22ecca41a","Type":"ContainerStarted","Data":"4c20a3a9efb58f7414d605e80484d3abb98eb80ede6f2e44180c4e58765fd222"} Mar 13 12:10:40 crc kubenswrapper[4786]: I0313 12:10:40.585831 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:10:40 crc kubenswrapper[4786]: I0313 12:10:40.970667 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.019403 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtvxk\" (UniqueName: \"kubernetes.io/projected/03a4f906-25da-4780-988b-444065d26080-kube-api-access-dtvxk\") pod \"03a4f906-25da-4780-988b-444065d26080\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.019473 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data-custom\") pod \"03a4f906-25da-4780-988b-444065d26080\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.019493 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03a4f906-25da-4780-988b-444065d26080-etc-machine-id\") pod \"03a4f906-25da-4780-988b-444065d26080\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.019545 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-scripts\") pod \"03a4f906-25da-4780-988b-444065d26080\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.019593 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data\") pod \"03a4f906-25da-4780-988b-444065d26080\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.019620 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-combined-ca-bundle\") pod \"03a4f906-25da-4780-988b-444065d26080\" (UID: \"03a4f906-25da-4780-988b-444065d26080\") " Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.019594 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03a4f906-25da-4780-988b-444065d26080-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "03a4f906-25da-4780-988b-444065d26080" (UID: "03a4f906-25da-4780-988b-444065d26080"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.020033 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03a4f906-25da-4780-988b-444065d26080-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.025305 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "03a4f906-25da-4780-988b-444065d26080" (UID: "03a4f906-25da-4780-988b-444065d26080"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.025508 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a4f906-25da-4780-988b-444065d26080-kube-api-access-dtvxk" (OuterVolumeSpecName: "kube-api-access-dtvxk") pod "03a4f906-25da-4780-988b-444065d26080" (UID: "03a4f906-25da-4780-988b-444065d26080"). InnerVolumeSpecName "kube-api-access-dtvxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.030096 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-scripts" (OuterVolumeSpecName: "scripts") pod "03a4f906-25da-4780-988b-444065d26080" (UID: "03a4f906-25da-4780-988b-444065d26080"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.075459 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03a4f906-25da-4780-988b-444065d26080" (UID: "03a4f906-25da-4780-988b-444065d26080"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.115036 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data" (OuterVolumeSpecName: "config-data") pod "03a4f906-25da-4780-988b-444065d26080" (UID: "03a4f906-25da-4780-988b-444065d26080"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.122099 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtvxk\" (UniqueName: \"kubernetes.io/projected/03a4f906-25da-4780-988b-444065d26080-kube-api-access-dtvxk\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.122135 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.122148 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.122159 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.122171 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a4f906-25da-4780-988b-444065d26080-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.369719 4786 generic.go:334] "Generic (PLEG): container finished" podID="03a4f906-25da-4780-988b-444065d26080" containerID="f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c" exitCode=0 Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.369782 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03a4f906-25da-4780-988b-444065d26080","Type":"ContainerDied","Data":"f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c"} Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 
12:10:41.369811 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03a4f906-25da-4780-988b-444065d26080","Type":"ContainerDied","Data":"3df41ccf0150da0c0b0edb7dcded8403c5b0d8a47af58b1e62c4c4030bd730aa"} Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.369830 4786 scope.go:117] "RemoveContainer" containerID="078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.369987 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.380115 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88c9bb894-zzsvv" event={"ID":"b0d491ad-ee68-47bb-a1e3-66d22ecca41a","Type":"ContainerStarted","Data":"52dda96d5af850effe132c05ad903de42b92e2b4d4cba2475c20cf70be8ffde6"} Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.380376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88c9bb894-zzsvv" event={"ID":"b0d491ad-ee68-47bb-a1e3-66d22ecca41a","Type":"ContainerStarted","Data":"f34dd912eb47d002fd56518d38540c1994f7c17513d2933e712f79bc0fca64c8"} Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.381221 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.381253 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.407270 4786 scope.go:117] "RemoveContainer" containerID="f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.416182 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-88c9bb894-zzsvv" podStartSLOduration=2.416156217 
podStartE2EDuration="2.416156217s" podCreationTimestamp="2026-03-13 12:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:41.403370211 +0000 UTC m=+1428.683023658" watchObservedRunningTime="2026-03-13 12:10:41.416156217 +0000 UTC m=+1428.695809664" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.463708 4786 scope.go:117] "RemoveContainer" containerID="078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.465081 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.465116 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:41 crc kubenswrapper[4786]: E0313 12:10:41.465190 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade\": container with ID starting with 078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade not found: ID does not exist" containerID="078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.465211 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade"} err="failed to get container status \"078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade\": rpc error: code = NotFound desc = could not find container \"078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade\": container with ID starting with 078eb4fc62802a62d8e8543e029e1a0ae153f5b22e63fc350bf5eaef32e97ade not found: ID does not exist" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.465234 4786 scope.go:117] 
"RemoveContainer" containerID="f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c" Mar 13 12:10:41 crc kubenswrapper[4786]: E0313 12:10:41.467870 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c\": container with ID starting with f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c not found: ID does not exist" containerID="f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.467934 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c"} err="failed to get container status \"f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c\": rpc error: code = NotFound desc = could not find container \"f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c\": container with ID starting with f059f695f20e2249bec8e952c0ef747f157021699825c42d94b1edbcdd789a9c not found: ID does not exist" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.474130 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:41 crc kubenswrapper[4786]: E0313 12:10:41.485330 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a4f906-25da-4780-988b-444065d26080" containerName="probe" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.485358 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a4f906-25da-4780-988b-444065d26080" containerName="probe" Mar 13 12:10:41 crc kubenswrapper[4786]: E0313 12:10:41.485385 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a4f906-25da-4780-988b-444065d26080" containerName="cinder-scheduler" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.485392 4786 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="03a4f906-25da-4780-988b-444065d26080" containerName="cinder-scheduler" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.485637 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a4f906-25da-4780-988b-444065d26080" containerName="cinder-scheduler" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.485659 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a4f906-25da-4780-988b-444065d26080" containerName="probe" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.486675 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.486763 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.492562 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.532065 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.532188 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.532251 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pq6l\" (UniqueName: 
\"kubernetes.io/projected/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-kube-api-access-7pq6l\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.532278 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.532352 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.532377 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.634019 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.634119 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pq6l\" (UniqueName: \"kubernetes.io/projected/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-kube-api-access-7pq6l\") 
pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.634151 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.634221 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.634259 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.634329 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.634140 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.639476 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.640527 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.641040 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.650976 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.653762 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pq6l\" (UniqueName: \"kubernetes.io/projected/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-kube-api-access-7pq6l\") pod \"cinder-scheduler-0\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " pod="openstack/cinder-scheduler-0" Mar 13 12:10:41 crc kubenswrapper[4786]: I0313 12:10:41.813910 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.265611 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:10:42 crc kubenswrapper[4786]: W0313 12:10:42.269294 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc067e1cd_e5a9_413d_9ddc_4e1f4a6a0441.slice/crio-fd51654ec66219ac04ac6664462e54d989bceffe704584118f28e4259e79ad7e WatchSource:0}: Error finding container fd51654ec66219ac04ac6664462e54d989bceffe704584118f28e4259e79ad7e: Status 404 returned error can't find the container with id fd51654ec66219ac04ac6664462e54d989bceffe704584118f28e4259e79ad7e Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.274602 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.275800 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.285328 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.285503 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.285670 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-t7w2n" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.291500 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.350767 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.351340 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srfh\" (UniqueName: \"kubernetes.io/projected/c9cf93cd-d636-4947-8318-0fade89f65d7-kube-api-access-2srfh\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.351527 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.351569 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.412698 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441","Type":"ContainerStarted","Data":"fd51654ec66219ac04ac6664462e54d989bceffe704584118f28e4259e79ad7e"} Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.455508 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.455844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.456551 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.456960 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2srfh\" (UniqueName: \"kubernetes.io/projected/c9cf93cd-d636-4947-8318-0fade89f65d7-kube-api-access-2srfh\") pod 
\"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.457033 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.474200 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.474813 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.485665 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srfh\" (UniqueName: \"kubernetes.io/projected/c9cf93cd-d636-4947-8318-0fade89f65d7-kube-api-access-2srfh\") pod \"openstackclient\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " pod="openstack/openstackclient" Mar 13 12:10:42 crc kubenswrapper[4786]: I0313 12:10:42.719128 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:10:43 crc kubenswrapper[4786]: I0313 12:10:43.253130 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 12:10:43 crc kubenswrapper[4786]: I0313 12:10:43.457721 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a4f906-25da-4780-988b-444065d26080" path="/var/lib/kubelet/pods/03a4f906-25da-4780-988b-444065d26080/volumes" Mar 13 12:10:43 crc kubenswrapper[4786]: I0313 12:10:43.458495 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441","Type":"ContainerStarted","Data":"72ec4a750cf5f8f8444ba20cf0c7ee683c4b7001b32595a47908763487ea5853"} Mar 13 12:10:43 crc kubenswrapper[4786]: I0313 12:10:43.458522 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c9cf93cd-d636-4947-8318-0fade89f65d7","Type":"ContainerStarted","Data":"54ffbd9728412d5ec659ff88bcdadd4d7f87f47f96f7dd442ee0045c042a34af"} Mar 13 12:10:44 crc kubenswrapper[4786]: I0313 12:10:44.463669 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441","Type":"ContainerStarted","Data":"f8c4849551e632909c931cb9c230d8766911210a5e2a92f2d4a1214b86907ca7"} Mar 13 12:10:44 crc kubenswrapper[4786]: I0313 12:10:44.615169 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 12:10:45 crc kubenswrapper[4786]: I0313 12:10:45.496671 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.496649187 podStartE2EDuration="4.496649187s" podCreationTimestamp="2026-03-13 12:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:45.491038735 +0000 UTC 
m=+1432.770692202" watchObservedRunningTime="2026-03-13 12:10:45.496649187 +0000 UTC m=+1432.776302654" Mar 13 12:10:46 crc kubenswrapper[4786]: I0313 12:10:46.814199 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.475315 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-dbb48765-fzcqd"] Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.491428 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-dbb48765-fzcqd"] Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.491532 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.494215 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.494418 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.497241 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.547068 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-internal-tls-certs\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.547167 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-etc-swift\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: 
\"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.547199 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-log-httpd\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.547270 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf6nq\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-kube-api-access-rf6nq\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.547294 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-combined-ca-bundle\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.547355 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-run-httpd\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.547397 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-config-data\") pod 
\"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.547761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-public-tls-certs\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.650206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-internal-tls-certs\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.650271 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-etc-swift\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.650294 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-log-httpd\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.650326 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6nq\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-kube-api-access-rf6nq\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: 
\"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.650348 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-combined-ca-bundle\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.650385 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-run-httpd\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.650414 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-config-data\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.650435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-public-tls-certs\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.651850 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-log-httpd\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 
crc kubenswrapper[4786]: I0313 12:10:47.652091 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-run-httpd\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.659968 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-config-data\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.660508 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-public-tls-certs\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.660530 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-internal-tls-certs\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.663119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-etc-swift\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.665609 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-combined-ca-bundle\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.671381 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6nq\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-kube-api-access-rf6nq\") pod \"swift-proxy-dbb48765-fzcqd\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:47 crc kubenswrapper[4786]: I0313 12:10:47.824468 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.072253 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.072779 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerName="glance-log" containerID="cri-o://7400bda19ed93a6aaa9008fa088ac58b20cc2ac4be45e3e0d28b2b9028eb3230" gracePeriod=30 Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.073466 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerName="glance-httpd" containerID="cri-o://a0309324151576503eed901fa51e562b264bd62cc01f6bb7639186ea905eebf9" gracePeriod=30 Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.389407 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-dbb48765-fzcqd"] Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.516747 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerID="7400bda19ed93a6aaa9008fa088ac58b20cc2ac4be45e3e0d28b2b9028eb3230" exitCode=143 Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.516799 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b","Type":"ContainerDied","Data":"7400bda19ed93a6aaa9008fa088ac58b20cc2ac4be45e3e0d28b2b9028eb3230"} Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.606553 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.606928 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="ceilometer-central-agent" containerID="cri-o://4ca295232845e30ee7782cbbc8f944cef3023c0f8b4f8fed9aa8cbb56a2c0002" gracePeriod=30 Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.606966 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="sg-core" containerID="cri-o://775ce4a56412c18bd9b4661ea4c62fb2c2c8e96fc8399268d2b2a0646a581b94" gracePeriod=30 Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.607050 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="proxy-httpd" containerID="cri-o://4b2785eb233bfc907642218cd7bf45b2c34261572582f57a5ec0d4f4cd2ab067" gracePeriod=30 Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.607068 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="ceilometer-notification-agent" containerID="cri-o://c503626f2b146a24b1400a3045566bf0eb8a620ea09bc42fc74d42629127a420" gracePeriod=30 
Mar 13 12:10:48 crc kubenswrapper[4786]: I0313 12:10:48.619051 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 13 12:10:49 crc kubenswrapper[4786]: I0313 12:10:49.528236 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerID="4b2785eb233bfc907642218cd7bf45b2c34261572582f57a5ec0d4f4cd2ab067" exitCode=0 Mar 13 12:10:49 crc kubenswrapper[4786]: I0313 12:10:49.528277 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerID="775ce4a56412c18bd9b4661ea4c62fb2c2c8e96fc8399268d2b2a0646a581b94" exitCode=2 Mar 13 12:10:49 crc kubenswrapper[4786]: I0313 12:10:49.528287 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerID="c503626f2b146a24b1400a3045566bf0eb8a620ea09bc42fc74d42629127a420" exitCode=0 Mar 13 12:10:49 crc kubenswrapper[4786]: I0313 12:10:49.528295 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerID="4ca295232845e30ee7782cbbc8f944cef3023c0f8b4f8fed9aa8cbb56a2c0002" exitCode=0 Mar 13 12:10:49 crc kubenswrapper[4786]: I0313 12:10:49.528318 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerDied","Data":"4b2785eb233bfc907642218cd7bf45b2c34261572582f57a5ec0d4f4cd2ab067"} Mar 13 12:10:49 crc kubenswrapper[4786]: I0313 12:10:49.528343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerDied","Data":"775ce4a56412c18bd9b4661ea4c62fb2c2c8e96fc8399268d2b2a0646a581b94"} Mar 13 12:10:49 crc kubenswrapper[4786]: I0313 12:10:49.528354 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerDied","Data":"c503626f2b146a24b1400a3045566bf0eb8a620ea09bc42fc74d42629127a420"} Mar 13 12:10:49 crc kubenswrapper[4786]: I0313 12:10:49.528363 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerDied","Data":"4ca295232845e30ee7782cbbc8f944cef3023c0f8b4f8fed9aa8cbb56a2c0002"} Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.252320 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-q9frn"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.254322 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.262595 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q9frn"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.319597 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blcb6\" (UniqueName: \"kubernetes.io/projected/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-kube-api-access-blcb6\") pod \"nova-api-db-create-q9frn\" (UID: \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\") " pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.319740 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-operator-scripts\") pod \"nova-api-db-create-q9frn\" (UID: \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\") " pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.351156 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mfpdw"] Mar 13 12:10:51 
crc kubenswrapper[4786]: I0313 12:10:51.352623 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.359713 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mfpdw"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.420440 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blcb6\" (UniqueName: \"kubernetes.io/projected/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-kube-api-access-blcb6\") pod \"nova-api-db-create-q9frn\" (UID: \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\") " pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.420927 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lnnq\" (UniqueName: \"kubernetes.io/projected/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-kube-api-access-5lnnq\") pod \"nova-cell0-db-create-mfpdw\" (UID: \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\") " pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.421042 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-operator-scripts\") pod \"nova-api-db-create-q9frn\" (UID: \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\") " pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.421093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-operator-scripts\") pod \"nova-cell0-db-create-mfpdw\" (UID: \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\") " pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.422240 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-operator-scripts\") pod \"nova-api-db-create-q9frn\" (UID: \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\") " pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.446093 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blcb6\" (UniqueName: \"kubernetes.io/projected/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-kube-api-access-blcb6\") pod \"nova-api-db-create-q9frn\" (UID: \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\") " pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.465959 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-j474k"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.467385 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.487174 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ac9a-account-create-update-rbc5m"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.488804 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.498503 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.512422 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-j474k"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.525539 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804557dd-c3fc-4502-8b3f-4bcabfb93688-operator-scripts\") pod \"nova-cell1-db-create-j474k\" (UID: \"804557dd-c3fc-4502-8b3f-4bcabfb93688\") " pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.525603 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-operator-scripts\") pod \"nova-cell0-db-create-mfpdw\" (UID: \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\") " pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.525635 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lqm\" (UniqueName: \"kubernetes.io/projected/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-kube-api-access-q6lqm\") pod \"nova-api-ac9a-account-create-update-rbc5m\" (UID: \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\") " pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.525703 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-operator-scripts\") pod \"nova-api-ac9a-account-create-update-rbc5m\" (UID: 
\"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\") " pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.525846 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqkzx\" (UniqueName: \"kubernetes.io/projected/804557dd-c3fc-4502-8b3f-4bcabfb93688-kube-api-access-sqkzx\") pod \"nova-cell1-db-create-j474k\" (UID: \"804557dd-c3fc-4502-8b3f-4bcabfb93688\") " pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.525897 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lnnq\" (UniqueName: \"kubernetes.io/projected/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-kube-api-access-5lnnq\") pod \"nova-cell0-db-create-mfpdw\" (UID: \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\") " pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.528225 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-operator-scripts\") pod \"nova-cell0-db-create-mfpdw\" (UID: \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\") " pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.533481 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac9a-account-create-update-rbc5m"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.547314 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lnnq\" (UniqueName: \"kubernetes.io/projected/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-kube-api-access-5lnnq\") pod \"nova-cell0-db-create-mfpdw\" (UID: \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\") " pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.558041 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerID="a0309324151576503eed901fa51e562b264bd62cc01f6bb7639186ea905eebf9" exitCode=0 Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.558087 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b","Type":"ContainerDied","Data":"a0309324151576503eed901fa51e562b264bd62cc01f6bb7639186ea905eebf9"} Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.577503 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.628290 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-operator-scripts\") pod \"nova-api-ac9a-account-create-update-rbc5m\" (UID: \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\") " pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.628429 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqkzx\" (UniqueName: \"kubernetes.io/projected/804557dd-c3fc-4502-8b3f-4bcabfb93688-kube-api-access-sqkzx\") pod \"nova-cell1-db-create-j474k\" (UID: \"804557dd-c3fc-4502-8b3f-4bcabfb93688\") " pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.628594 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804557dd-c3fc-4502-8b3f-4bcabfb93688-operator-scripts\") pod \"nova-cell1-db-create-j474k\" (UID: \"804557dd-c3fc-4502-8b3f-4bcabfb93688\") " pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.628644 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q6lqm\" (UniqueName: \"kubernetes.io/projected/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-kube-api-access-q6lqm\") pod \"nova-api-ac9a-account-create-update-rbc5m\" (UID: \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\") " pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.629210 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804557dd-c3fc-4502-8b3f-4bcabfb93688-operator-scripts\") pod \"nova-cell1-db-create-j474k\" (UID: \"804557dd-c3fc-4502-8b3f-4bcabfb93688\") " pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.629680 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-operator-scripts\") pod \"nova-api-ac9a-account-create-update-rbc5m\" (UID: \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\") " pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.650696 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6lqm\" (UniqueName: \"kubernetes.io/projected/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-kube-api-access-q6lqm\") pod \"nova-api-ac9a-account-create-update-rbc5m\" (UID: \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\") " pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.651458 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqkzx\" (UniqueName: \"kubernetes.io/projected/804557dd-c3fc-4502-8b3f-4bcabfb93688-kube-api-access-sqkzx\") pod \"nova-cell1-db-create-j474k\" (UID: \"804557dd-c3fc-4502-8b3f-4bcabfb93688\") " pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.667622 4786 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell0-ee17-account-create-update-5l6xl"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.668709 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.671541 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.677664 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ee17-account-create-update-5l6xl"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.686698 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.729932 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4tgh\" (UniqueName: \"kubernetes.io/projected/f70490bf-3f7e-4490-b045-dd095a1fdd16-kube-api-access-r4tgh\") pod \"nova-cell0-ee17-account-create-update-5l6xl\" (UID: \"f70490bf-3f7e-4490-b045-dd095a1fdd16\") " pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.730035 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70490bf-3f7e-4490-b045-dd095a1fdd16-operator-scripts\") pod \"nova-cell0-ee17-account-create-update-5l6xl\" (UID: \"f70490bf-3f7e-4490-b045-dd095a1fdd16\") " pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.816846 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.825821 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.831692 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4tgh\" (UniqueName: \"kubernetes.io/projected/f70490bf-3f7e-4490-b045-dd095a1fdd16-kube-api-access-r4tgh\") pod \"nova-cell0-ee17-account-create-update-5l6xl\" (UID: \"f70490bf-3f7e-4490-b045-dd095a1fdd16\") " pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.831792 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70490bf-3f7e-4490-b045-dd095a1fdd16-operator-scripts\") pod \"nova-cell0-ee17-account-create-update-5l6xl\" (UID: \"f70490bf-3f7e-4490-b045-dd095a1fdd16\") " pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.832676 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70490bf-3f7e-4490-b045-dd095a1fdd16-operator-scripts\") pod \"nova-cell0-ee17-account-create-update-5l6xl\" (UID: \"f70490bf-3f7e-4490-b045-dd095a1fdd16\") " pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.851047 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-91d2-account-create-update-kmjpl"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.852388 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.873307 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4tgh\" (UniqueName: \"kubernetes.io/projected/f70490bf-3f7e-4490-b045-dd095a1fdd16-kube-api-access-r4tgh\") pod \"nova-cell0-ee17-account-create-update-5l6xl\" (UID: \"f70490bf-3f7e-4490-b045-dd095a1fdd16\") " pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.873391 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.878443 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-91d2-account-create-update-kmjpl"] Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.933898 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8e93b-aef1-4d5b-a40d-eaad723384cf-operator-scripts\") pod \"nova-cell1-91d2-account-create-update-kmjpl\" (UID: \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\") " pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:51 crc kubenswrapper[4786]: I0313 12:10:51.933952 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd456\" (UniqueName: \"kubernetes.io/projected/eef8e93b-aef1-4d5b-a40d-eaad723384cf-kube-api-access-fd456\") pod \"nova-cell1-91d2-account-create-update-kmjpl\" (UID: \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\") " pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:52 crc kubenswrapper[4786]: I0313 12:10:52.036028 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/eef8e93b-aef1-4d5b-a40d-eaad723384cf-operator-scripts\") pod \"nova-cell1-91d2-account-create-update-kmjpl\" (UID: \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\") " pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:52 crc kubenswrapper[4786]: I0313 12:10:52.036086 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd456\" (UniqueName: \"kubernetes.io/projected/eef8e93b-aef1-4d5b-a40d-eaad723384cf-kube-api-access-fd456\") pod \"nova-cell1-91d2-account-create-update-kmjpl\" (UID: \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\") " pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:52 crc kubenswrapper[4786]: I0313 12:10:52.036862 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8e93b-aef1-4d5b-a40d-eaad723384cf-operator-scripts\") pod \"nova-cell1-91d2-account-create-update-kmjpl\" (UID: \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\") " pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:52 crc kubenswrapper[4786]: I0313 12:10:52.052328 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd456\" (UniqueName: \"kubernetes.io/projected/eef8e93b-aef1-4d5b-a40d-eaad723384cf-kube-api-access-fd456\") pod \"nova-cell1-91d2-account-create-update-kmjpl\" (UID: \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\") " pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:52 crc kubenswrapper[4786]: I0313 12:10:52.063316 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:52 crc kubenswrapper[4786]: I0313 12:10:52.113921 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 12:10:52 crc kubenswrapper[4786]: I0313 12:10:52.226636 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:53 crc kubenswrapper[4786]: W0313 12:10:53.840717 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod129e2d9e_bcc5_4fb2_815c_29d99648b1f3.slice/crio-352ae26b5bd48de42e619f7d26f8c71ff20bdba4e51f52483335312f2ce61310 WatchSource:0}: Error finding container 352ae26b5bd48de42e619f7d26f8c71ff20bdba4e51f52483335312f2ce61310: Status 404 returned error can't find the container with id 352ae26b5bd48de42e619f7d26f8c71ff20bdba4e51f52483335312f2ce61310 Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.401391 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.490042 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-internal-tls-certs\") pod \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.490123 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-httpd-run\") pod \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.490149 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-combined-ca-bundle\") pod \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.490165 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-logs\") pod \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.490234 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc9fj\" (UniqueName: \"kubernetes.io/projected/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-kube-api-access-hc9fj\") pod \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.490287 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-config-data\") pod \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.490363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-scripts\") pod \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.490440 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\" (UID: \"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.491247 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" (UID: "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.493020 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-logs" (OuterVolumeSpecName: "logs") pod "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" (UID: "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.498007 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-kube-api-access-hc9fj" (OuterVolumeSpecName: "kube-api-access-hc9fj") pod "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" (UID: "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b"). InnerVolumeSpecName "kube-api-access-hc9fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.505939 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" (UID: "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.507438 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-scripts" (OuterVolumeSpecName: "scripts") pod "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" (UID: "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.519114 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.525251 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" (UID: "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.605711 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.605766 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.605779 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.605794 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.605826 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc9fj\" (UniqueName: \"kubernetes.io/projected/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-kube-api-access-hc9fj\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.605841 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.616677 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-config-data" (OuterVolumeSpecName: "config-data") pod "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" (UID: "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.647133 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.647140 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"47d543fb-21ea-4ff4-96e3-54de7f6d6d1b","Type":"ContainerDied","Data":"3fb8306c9237486f944233ffd6ac1e88267bb27215b5bbb7a6908c10ec2292c8"} Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.647964 4786 scope.go:117] "RemoveContainer" containerID="a0309324151576503eed901fa51e562b264bd62cc01f6bb7639186ea905eebf9" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.650059 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dbb48765-fzcqd" event={"ID":"129e2d9e-bcc5-4fb2-815c-29d99648b1f3","Type":"ContainerStarted","Data":"bd18ddf3196c9bf5a4a9b83508e122c58fe4f289252ce5dccbca45f28bb401b8"} Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.650097 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dbb48765-fzcqd" event={"ID":"129e2d9e-bcc5-4fb2-815c-29d99648b1f3","Type":"ContainerStarted","Data":"352ae26b5bd48de42e619f7d26f8c71ff20bdba4e51f52483335312f2ce61310"} Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.675519 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0b43d76-6530-436e-9c8f-a3de4193f997","Type":"ContainerDied","Data":"245bd61bd0c189010f4b671cc24741f1c33d6871f16ded771881cf02ac482201"} Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.675654 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.707908 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-scripts\") pod \"b0b43d76-6530-436e-9c8f-a3de4193f997\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.707986 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-log-httpd\") pod \"b0b43d76-6530-436e-9c8f-a3de4193f997\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.708048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-run-httpd\") pod \"b0b43d76-6530-436e-9c8f-a3de4193f997\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.708083 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-config-data\") pod \"b0b43d76-6530-436e-9c8f-a3de4193f997\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.708105 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-combined-ca-bundle\") pod \"b0b43d76-6530-436e-9c8f-a3de4193f997\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.708152 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29vj2\" (UniqueName: 
\"kubernetes.io/projected/b0b43d76-6530-436e-9c8f-a3de4193f997-kube-api-access-29vj2\") pod \"b0b43d76-6530-436e-9c8f-a3de4193f997\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.708181 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-sg-core-conf-yaml\") pod \"b0b43d76-6530-436e-9c8f-a3de4193f997\" (UID: \"b0b43d76-6530-436e-9c8f-a3de4193f997\") " Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.708903 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c9cf93cd-d636-4947-8318-0fade89f65d7","Type":"ContainerStarted","Data":"2196dbffe1af6cbceffdff832f9438466b94634b242cc388d431fdf75e8ded98"} Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.709102 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.710769 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0b43d76-6530-436e-9c8f-a3de4193f997" (UID: "b0b43d76-6530-436e-9c8f-a3de4193f997"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.713892 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0b43d76-6530-436e-9c8f-a3de4193f997" (UID: "b0b43d76-6530-436e-9c8f-a3de4193f997"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.722196 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.722359 4786 scope.go:117] "RemoveContainer" containerID="7400bda19ed93a6aaa9008fa088ac58b20cc2ac4be45e3e0d28b2b9028eb3230" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.732054 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" (UID: "47d543fb-21ea-4ff4-96e3-54de7f6d6d1b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.751615 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.977135537 podStartE2EDuration="12.751596938s" podCreationTimestamp="2026-03-13 12:10:42 +0000 UTC" firstStartedPulling="2026-03-13 12:10:43.281741108 +0000 UTC m=+1430.561394555" lastFinishedPulling="2026-03-13 12:10:54.056202509 +0000 UTC m=+1441.335855956" observedRunningTime="2026-03-13 12:10:54.743079097 +0000 UTC m=+1442.022732544" watchObservedRunningTime="2026-03-13 12:10:54.751596938 +0000 UTC m=+1442.031250385" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.754435 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-scripts" (OuterVolumeSpecName: "scripts") pod "b0b43d76-6530-436e-9c8f-a3de4193f997" (UID: "b0b43d76-6530-436e-9c8f-a3de4193f997"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.788126 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b43d76-6530-436e-9c8f-a3de4193f997-kube-api-access-29vj2" (OuterVolumeSpecName: "kube-api-access-29vj2") pod "b0b43d76-6530-436e-9c8f-a3de4193f997" (UID: "b0b43d76-6530-436e-9c8f-a3de4193f997"). InnerVolumeSpecName "kube-api-access-29vj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.813466 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.813506 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.813518 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.813528 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0b43d76-6530-436e-9c8f-a3de4193f997-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.813536 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.813546 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29vj2\" (UniqueName: 
\"kubernetes.io/projected/b0b43d76-6530-436e-9c8f-a3de4193f997-kube-api-access-29vj2\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.813832 4786 scope.go:117] "RemoveContainer" containerID="4b2785eb233bfc907642218cd7bf45b2c34261572582f57a5ec0d4f4cd2ab067" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.823584 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0b43d76-6530-436e-9c8f-a3de4193f997" (UID: "b0b43d76-6530-436e-9c8f-a3de4193f997"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.859043 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0b43d76-6530-436e-9c8f-a3de4193f997" (UID: "b0b43d76-6530-436e-9c8f-a3de4193f997"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.898628 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q9frn"] Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.902082 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-config-data" (OuterVolumeSpecName: "config-data") pod "b0b43d76-6530-436e-9c8f-a3de4193f997" (UID: "b0b43d76-6530-436e-9c8f-a3de4193f997"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:54 crc kubenswrapper[4786]: W0313 12:10:54.908008 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda17ac7e3_0ac7_480c_9909_b4c3cc76696b.slice/crio-b43c88529cf7ec6dcdd149a2965a2bcf854b2595120c4c4bd9f18057ae3255dc WatchSource:0}: Error finding container b43c88529cf7ec6dcdd149a2965a2bcf854b2595120c4c4bd9f18057ae3255dc: Status 404 returned error can't find the container with id b43c88529cf7ec6dcdd149a2965a2bcf854b2595120c4c4bd9f18057ae3255dc Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.909888 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac9a-account-create-update-rbc5m"] Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.915722 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.915747 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.915758 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0b43d76-6530-436e-9c8f-a3de4193f997-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.920323 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 12:10:54 crc kubenswrapper[4786]: I0313 12:10:54.966898 4786 scope.go:117] "RemoveContainer" containerID="775ce4a56412c18bd9b4661ea4c62fb2c2c8e96fc8399268d2b2a0646a581b94" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.006473 4786 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.034812 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.063640 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: E0313 12:10:55.065670 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="ceilometer-central-agent" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.065690 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="ceilometer-central-agent" Mar 13 12:10:55 crc kubenswrapper[4786]: E0313 12:10:55.065703 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerName="glance-httpd" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.065709 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerName="glance-httpd" Mar 13 12:10:55 crc kubenswrapper[4786]: E0313 12:10:55.065721 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="sg-core" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.065727 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="sg-core" Mar 13 12:10:55 crc kubenswrapper[4786]: E0313 12:10:55.065743 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="proxy-httpd" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.065748 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="proxy-httpd" 
Mar 13 12:10:55 crc kubenswrapper[4786]: E0313 12:10:55.065761 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="ceilometer-notification-agent" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.065767 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="ceilometer-notification-agent" Mar 13 12:10:55 crc kubenswrapper[4786]: E0313 12:10:55.065782 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerName="glance-log" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.065788 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerName="glance-log" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.065978 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerName="glance-log" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.068355 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="ceilometer-central-agent" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.068375 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="ceilometer-notification-agent" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.068388 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" containerName="glance-httpd" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.068403 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="sg-core" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.068416 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" containerName="proxy-httpd" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.068260 4786 scope.go:117] "RemoveContainer" containerID="c503626f2b146a24b1400a3045566bf0eb8a620ea09bc42fc74d42629127a420" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.069399 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.077774 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.077937 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.080546 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.084080 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.109974 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.121074 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.123513 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.127666 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.127847 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.130540 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.168157 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-j474k"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.172714 4786 scope.go:117] "RemoveContainer" containerID="4ca295232845e30ee7782cbbc8f944cef3023c0f8b4f8fed9aa8cbb56a2c0002" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.177456 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mfpdw"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.186487 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-91d2-account-create-update-kmjpl"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.194547 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ee17-account-create-update-5l6xl"] Mar 13 12:10:55 crc kubenswrapper[4786]: W0313 12:10:55.207217 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod804557dd_c3fc_4502_8b3f_4bcabfb93688.slice/crio-55d24c1e0479a7ddd2a5d8d757a757a484c9361a173909299d09d0f612b7fb75 WatchSource:0}: Error finding container 55d24c1e0479a7ddd2a5d8d757a757a484c9361a173909299d09d0f612b7fb75: Status 404 returned error can't find the container with id 55d24c1e0479a7ddd2a5d8d757a757a484c9361a173909299d09d0f612b7fb75 Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.221817 
4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222094 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222199 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-config-data\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222321 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-log-httpd\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222497 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhzb\" (UniqueName: \"kubernetes.io/projected/3e9745df-949d-443d-93bb-0e5b3692ccd6-kube-api-access-hnhzb\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222765 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-scripts\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.222969 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9ntl\" (UniqueName: \"kubernetes.io/projected/ad4aa260-7f29-4b03-aa53-f927a39b2370-kube-api-access-j9ntl\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.223068 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.223193 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.223302 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.223439 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.224257 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-run-httpd\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: W0313 12:10:55.256751 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef8e93b_aef1_4d5b_a40d_eaad723384cf.slice/crio-9cb3d4320097ba11ddcbbae5b9d6ab0650e8cfa466c15a4f30d1a530d92ededa WatchSource:0}: Error finding container 9cb3d4320097ba11ddcbbae5b9d6ab0650e8cfa466c15a4f30d1a530d92ededa: Status 404 returned error can't find the container with id 9cb3d4320097ba11ddcbbae5b9d6ab0650e8cfa466c15a4f30d1a530d92ededa Mar 13 12:10:55 crc kubenswrapper[4786]: W0313 12:10:55.257172 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf70490bf_3f7e_4490_b045_dd095a1fdd16.slice/crio-1e8c682100aaaa07ec488869519fc3a94c0fde7c2011bb9392376b52cad9d934 WatchSource:0}: Error finding container 1e8c682100aaaa07ec488869519fc3a94c0fde7c2011bb9392376b52cad9d934: Status 404 returned error can't find the container with id 1e8c682100aaaa07ec488869519fc3a94c0fde7c2011bb9392376b52cad9d934 Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.264226 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.267791 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.325854 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.325934 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") 
" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.325987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-run-httpd\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326077 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326107 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-config-data\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326127 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-log-httpd\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326175 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326196 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326334 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhzb\" (UniqueName: \"kubernetes.io/projected/3e9745df-949d-443d-93bb-0e5b3692ccd6-kube-api-access-hnhzb\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326387 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326438 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-scripts\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326506 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9ntl\" (UniqueName: \"kubernetes.io/projected/ad4aa260-7f29-4b03-aa53-f927a39b2370-kube-api-access-j9ntl\") pod \"ceilometer-0\" 
(UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326532 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.326627 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.327871 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.328973 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.328970 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-log-httpd\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 
12:10:55.329008 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-run-httpd\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.329325 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.335928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.336117 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.336956 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-config-data\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.337477 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-scripts\") pod \"ceilometer-0\" (UID: 
\"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.337952 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.348212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.362240 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.365722 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhzb\" (UniqueName: \"kubernetes.io/projected/3e9745df-949d-443d-93bb-0e5b3692ccd6-kube-api-access-hnhzb\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.368467 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 
12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.371011 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9ntl\" (UniqueName: \"kubernetes.io/projected/ad4aa260-7f29-4b03-aa53-f927a39b2370-kube-api-access-j9ntl\") pod \"ceilometer-0\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.381247 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.387379 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.461686 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d543fb-21ea-4ff4-96e3-54de7f6d6d1b" path="/var/lib/kubelet/pods/47d543fb-21ea-4ff4-96e3-54de7f6d6d1b/volumes" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.462988 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b43d76-6530-436e-9c8f-a3de4193f997" path="/var/lib/kubelet/pods/b0b43d76-6530-436e-9c8f-a3de4193f997/volumes" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.654422 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.757480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q9frn" event={"ID":"a17ac7e3-0ac7-480c-9909-b4c3cc76696b","Type":"ContainerStarted","Data":"8d6a8cb1e53e6f557d6e7eb5500d9064d12283cd86a24ee5003fc06a11a56be3"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.757521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q9frn" event={"ID":"a17ac7e3-0ac7-480c-9909-b4c3cc76696b","Type":"ContainerStarted","Data":"b43c88529cf7ec6dcdd149a2965a2bcf854b2595120c4c4bd9f18057ae3255dc"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.781575 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-q9frn" podStartSLOduration=4.781553143 podStartE2EDuration="4.781553143s" podCreationTimestamp="2026-03-13 12:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:55.777264307 +0000 UTC m=+1443.056917754" watchObservedRunningTime="2026-03-13 12:10:55.781553143 +0000 UTC m=+1443.061206600" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.792459 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" event={"ID":"eef8e93b-aef1-4d5b-a40d-eaad723384cf","Type":"ContainerStarted","Data":"194f868d7b6d5ddac02b8dc95f77ded38e44b17bfc81fa80fb0e1e9b550a6775"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.792505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" event={"ID":"eef8e93b-aef1-4d5b-a40d-eaad723384cf","Type":"ContainerStarted","Data":"9cb3d4320097ba11ddcbbae5b9d6ab0650e8cfa466c15a4f30d1a530d92ededa"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.817366 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" event={"ID":"9edc8ac7-41c7-4051-aca8-9fc79e516a2b","Type":"ContainerStarted","Data":"493eb4efaa56ff53a7930283def29db5ea69112fa0c04f9828e3b5e04a2fc1b9"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.817421 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" event={"ID":"9edc8ac7-41c7-4051-aca8-9fc79e516a2b","Type":"ContainerStarted","Data":"b64633c3fbff2fcb165efd1fc7f709222dea24fcde1aaaab067c67da08abbea9"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.831033 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dbb48765-fzcqd" event={"ID":"129e2d9e-bcc5-4fb2-815c-29d99648b1f3","Type":"ContainerStarted","Data":"6faefcc8f8f08a959c2efe031b6171c7238793dbc61001559ea27795b9e169c2"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.831231 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.831660 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.880299 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" event={"ID":"f70490bf-3f7e-4490-b045-dd095a1fdd16","Type":"ContainerStarted","Data":"8f4a77d63c9b8e9d37ac53c4d16ee3f4eecc62fce4ba87cde9159abeab1c14a2"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.880345 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" event={"ID":"f70490bf-3f7e-4490-b045-dd095a1fdd16","Type":"ContainerStarted","Data":"1e8c682100aaaa07ec488869519fc3a94c0fde7c2011bb9392376b52cad9d934"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.906930 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" podStartSLOduration=4.896667198 podStartE2EDuration="4.896667198s" podCreationTimestamp="2026-03-13 12:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:55.810174061 +0000 UTC m=+1443.089827518" watchObservedRunningTime="2026-03-13 12:10:55.896667198 +0000 UTC m=+1443.176320645" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.913626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mfpdw" event={"ID":"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf","Type":"ContainerStarted","Data":"43af16d54d57e48978f9e8c2ddeef30020c41ef69f7a4f2ae087170c105f9dc4"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.918275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mfpdw" event={"ID":"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf","Type":"ContainerStarted","Data":"dc85aa187427e6e6c8ff07ce89c27ddcb050540a500e69c98321b920da87563f"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.928182 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" podStartSLOduration=4.9281623119999995 podStartE2EDuration="4.928162312s" podCreationTimestamp="2026-03-13 12:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:55.838811177 +0000 UTC m=+1443.118464624" watchObservedRunningTime="2026-03-13 12:10:55.928162312 +0000 UTC m=+1443.207815759" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.929837 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j474k" 
event={"ID":"804557dd-c3fc-4502-8b3f-4bcabfb93688","Type":"ContainerStarted","Data":"7d62ff3e67b7bcfe3dfbaba001479ada67c0df7d19973a69f85cee8431f830b1"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.929907 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j474k" event={"ID":"804557dd-c3fc-4502-8b3f-4bcabfb93688","Type":"ContainerStarted","Data":"55d24c1e0479a7ddd2a5d8d757a757a484c9361a173909299d09d0f612b7fb75"} Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.953049 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-dbb48765-fzcqd" podStartSLOduration=8.953032357 podStartE2EDuration="8.953032357s" podCreationTimestamp="2026-03-13 12:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:55.865337447 +0000 UTC m=+1443.144990904" watchObservedRunningTime="2026-03-13 12:10:55.953032357 +0000 UTC m=+1443.232685814" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.965182 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.971096 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" podStartSLOduration=4.971077536 podStartE2EDuration="4.971077536s" podCreationTimestamp="2026-03-13 12:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:55.918479999 +0000 UTC m=+1443.198133446" watchObservedRunningTime="2026-03-13 12:10:55.971077536 +0000 UTC m=+1443.250730993" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.975802 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-mfpdw" podStartSLOduration=4.975794474 
podStartE2EDuration="4.975794474s" podCreationTimestamp="2026-03-13 12:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:55.938752449 +0000 UTC m=+1443.218405896" watchObservedRunningTime="2026-03-13 12:10:55.975794474 +0000 UTC m=+1443.255447921" Mar 13 12:10:55 crc kubenswrapper[4786]: I0313 12:10:55.996204 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-j474k" podStartSLOduration=4.996177838 podStartE2EDuration="4.996177838s" podCreationTimestamp="2026-03-13 12:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:55.950981241 +0000 UTC m=+1443.230634698" watchObservedRunningTime="2026-03-13 12:10:55.996177838 +0000 UTC m=+1443.275831285" Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.359858 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.597592 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.598077 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-log" containerID="cri-o://4386657e12cfec63465a3b15404c56ebc92434156273aa41cc0e3fb89d3392fd" gracePeriod=30 Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.598452 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-httpd" containerID="cri-o://242cb13505de80d789eccfaca107cdef2f91c7dca24ac7ce426c66c1b256a47f" gracePeriod=30 Mar 13 12:10:56 crc 
kubenswrapper[4786]: I0313 12:10:56.940392 4786 generic.go:334] "Generic (PLEG): container finished" podID="eef8e93b-aef1-4d5b-a40d-eaad723384cf" containerID="194f868d7b6d5ddac02b8dc95f77ded38e44b17bfc81fa80fb0e1e9b550a6775" exitCode=0 Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.940595 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" event={"ID":"eef8e93b-aef1-4d5b-a40d-eaad723384cf","Type":"ContainerDied","Data":"194f868d7b6d5ddac02b8dc95f77ded38e44b17bfc81fa80fb0e1e9b550a6775"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.942764 4786 generic.go:334] "Generic (PLEG): container finished" podID="9edc8ac7-41c7-4051-aca8-9fc79e516a2b" containerID="493eb4efaa56ff53a7930283def29db5ea69112fa0c04f9828e3b5e04a2fc1b9" exitCode=0 Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.942838 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" event={"ID":"9edc8ac7-41c7-4051-aca8-9fc79e516a2b","Type":"ContainerDied","Data":"493eb4efaa56ff53a7930283def29db5ea69112fa0c04f9828e3b5e04a2fc1b9"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.944092 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerStarted","Data":"8010a78b01afc195a76853ded69c3026c8b76f3fb54af586bf26d2ed707f8369"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.944119 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerStarted","Data":"571b8e62f128e12cacfbb6506c46a1855fc0b106379ac2848426d73a050b89f9"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.945810 4786 generic.go:334] "Generic (PLEG): container finished" podID="f70490bf-3f7e-4490-b045-dd095a1fdd16" containerID="8f4a77d63c9b8e9d37ac53c4d16ee3f4eecc62fce4ba87cde9159abeab1c14a2" exitCode=0 
Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.945842 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" event={"ID":"f70490bf-3f7e-4490-b045-dd095a1fdd16","Type":"ContainerDied","Data":"8f4a77d63c9b8e9d37ac53c4d16ee3f4eecc62fce4ba87cde9159abeab1c14a2"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.947295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e9745df-949d-443d-93bb-0e5b3692ccd6","Type":"ContainerStarted","Data":"c855a610ac23727fbb84ed8ff32f53ad32c9347c74559efd48b339b33cf3996b"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.949360 4786 generic.go:334] "Generic (PLEG): container finished" podID="d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf" containerID="43af16d54d57e48978f9e8c2ddeef30020c41ef69f7a4f2ae087170c105f9dc4" exitCode=0 Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.949441 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mfpdw" event={"ID":"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf","Type":"ContainerDied","Data":"43af16d54d57e48978f9e8c2ddeef30020c41ef69f7a4f2ae087170c105f9dc4"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.952094 4786 generic.go:334] "Generic (PLEG): container finished" podID="a17ac7e3-0ac7-480c-9909-b4c3cc76696b" containerID="8d6a8cb1e53e6f557d6e7eb5500d9064d12283cd86a24ee5003fc06a11a56be3" exitCode=0 Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.952248 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q9frn" event={"ID":"a17ac7e3-0ac7-480c-9909-b4c3cc76696b","Type":"ContainerDied","Data":"8d6a8cb1e53e6f557d6e7eb5500d9064d12283cd86a24ee5003fc06a11a56be3"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.958321 4786 generic.go:334] "Generic (PLEG): container finished" podID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" 
containerID="4386657e12cfec63465a3b15404c56ebc92434156273aa41cc0e3fb89d3392fd" exitCode=143 Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.958378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0be8ed6-db24-42dd-8e7d-406ce46d2787","Type":"ContainerDied","Data":"4386657e12cfec63465a3b15404c56ebc92434156273aa41cc0e3fb89d3392fd"} Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.959645 4786 generic.go:334] "Generic (PLEG): container finished" podID="804557dd-c3fc-4502-8b3f-4bcabfb93688" containerID="7d62ff3e67b7bcfe3dfbaba001479ada67c0df7d19973a69f85cee8431f830b1" exitCode=0 Mar 13 12:10:56 crc kubenswrapper[4786]: I0313 12:10:56.960467 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j474k" event={"ID":"804557dd-c3fc-4502-8b3f-4bcabfb93688","Type":"ContainerDied","Data":"7d62ff3e67b7bcfe3dfbaba001479ada67c0df7d19973a69f85cee8431f830b1"} Mar 13 12:10:57 crc kubenswrapper[4786]: I0313 12:10:57.971517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerStarted","Data":"92113799eaee49b183a86742bf6e256427a3316668d3201346069a15e93dadff"} Mar 13 12:10:57 crc kubenswrapper[4786]: I0313 12:10:57.974832 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e9745df-949d-443d-93bb-0e5b3692ccd6","Type":"ContainerStarted","Data":"00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120"} Mar 13 12:10:57 crc kubenswrapper[4786]: I0313 12:10:57.974875 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e9745df-949d-443d-93bb-0e5b3692ccd6","Type":"ContainerStarted","Data":"bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3"} Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.028551 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.028529472 podStartE2EDuration="4.028529472s" podCreationTimestamp="2026-03-13 12:10:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:58.000876002 +0000 UTC m=+1445.280529449" watchObservedRunningTime="2026-03-13 12:10:58.028529472 +0000 UTC m=+1445.308182919" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.440779 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.502084 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70490bf-3f7e-4490-b045-dd095a1fdd16-operator-scripts\") pod \"f70490bf-3f7e-4490-b045-dd095a1fdd16\" (UID: \"f70490bf-3f7e-4490-b045-dd095a1fdd16\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.502303 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4tgh\" (UniqueName: \"kubernetes.io/projected/f70490bf-3f7e-4490-b045-dd095a1fdd16-kube-api-access-r4tgh\") pod \"f70490bf-3f7e-4490-b045-dd095a1fdd16\" (UID: \"f70490bf-3f7e-4490-b045-dd095a1fdd16\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.503900 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f70490bf-3f7e-4490-b045-dd095a1fdd16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f70490bf-3f7e-4490-b045-dd095a1fdd16" (UID: "f70490bf-3f7e-4490-b045-dd095a1fdd16"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.527225 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f70490bf-3f7e-4490-b045-dd095a1fdd16-kube-api-access-r4tgh" (OuterVolumeSpecName: "kube-api-access-r4tgh") pod "f70490bf-3f7e-4490-b045-dd095a1fdd16" (UID: "f70490bf-3f7e-4490-b045-dd095a1fdd16"). InnerVolumeSpecName "kube-api-access-r4tgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.594853 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.604612 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70490bf-3f7e-4490-b045-dd095a1fdd16-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.604640 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4tgh\" (UniqueName: \"kubernetes.io/projected/f70490bf-3f7e-4490-b045-dd095a1fdd16-kube-api-access-r4tgh\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.606797 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.629941 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.653566 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.655484 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.705568 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8e93b-aef1-4d5b-a40d-eaad723384cf-operator-scripts\") pod \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\" (UID: \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.705670 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-operator-scripts\") pod \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\" (UID: \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.705774 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blcb6\" (UniqueName: \"kubernetes.io/projected/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-kube-api-access-blcb6\") pod \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\" (UID: \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.705865 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6lqm\" (UniqueName: \"kubernetes.io/projected/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-kube-api-access-q6lqm\") pod \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\" (UID: \"9edc8ac7-41c7-4051-aca8-9fc79e516a2b\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.705916 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd456\" (UniqueName: \"kubernetes.io/projected/eef8e93b-aef1-4d5b-a40d-eaad723384cf-kube-api-access-fd456\") pod \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\" (UID: \"eef8e93b-aef1-4d5b-a40d-eaad723384cf\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.705987 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-operator-scripts\") pod \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\" (UID: \"a17ac7e3-0ac7-480c-9909-b4c3cc76696b\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.706081 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef8e93b-aef1-4d5b-a40d-eaad723384cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eef8e93b-aef1-4d5b-a40d-eaad723384cf" (UID: "eef8e93b-aef1-4d5b-a40d-eaad723384cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.706827 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9edc8ac7-41c7-4051-aca8-9fc79e516a2b" (UID: "9edc8ac7-41c7-4051-aca8-9fc79e516a2b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.707128 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8e93b-aef1-4d5b-a40d-eaad723384cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.707149 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.707480 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a17ac7e3-0ac7-480c-9909-b4c3cc76696b" (UID: "a17ac7e3-0ac7-480c-9909-b4c3cc76696b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.709759 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-kube-api-access-q6lqm" (OuterVolumeSpecName: "kube-api-access-q6lqm") pod "9edc8ac7-41c7-4051-aca8-9fc79e516a2b" (UID: "9edc8ac7-41c7-4051-aca8-9fc79e516a2b"). InnerVolumeSpecName "kube-api-access-q6lqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.712075 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-kube-api-access-blcb6" (OuterVolumeSpecName: "kube-api-access-blcb6") pod "a17ac7e3-0ac7-480c-9909-b4c3cc76696b" (UID: "a17ac7e3-0ac7-480c-9909-b4c3cc76696b"). InnerVolumeSpecName "kube-api-access-blcb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.712418 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef8e93b-aef1-4d5b-a40d-eaad723384cf-kube-api-access-fd456" (OuterVolumeSpecName: "kube-api-access-fd456") pod "eef8e93b-aef1-4d5b-a40d-eaad723384cf" (UID: "eef8e93b-aef1-4d5b-a40d-eaad723384cf"). InnerVolumeSpecName "kube-api-access-fd456". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.808437 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804557dd-c3fc-4502-8b3f-4bcabfb93688-operator-scripts\") pod \"804557dd-c3fc-4502-8b3f-4bcabfb93688\" (UID: \"804557dd-c3fc-4502-8b3f-4bcabfb93688\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.808732 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-operator-scripts\") pod \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\" (UID: \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.808774 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqkzx\" (UniqueName: \"kubernetes.io/projected/804557dd-c3fc-4502-8b3f-4bcabfb93688-kube-api-access-sqkzx\") pod \"804557dd-c3fc-4502-8b3f-4bcabfb93688\" (UID: \"804557dd-c3fc-4502-8b3f-4bcabfb93688\") " Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.808922 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lnnq\" (UniqueName: \"kubernetes.io/projected/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-kube-api-access-5lnnq\") pod \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\" (UID: \"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf\") " Mar 13 12:10:58 crc 
kubenswrapper[4786]: I0313 12:10:58.809412 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blcb6\" (UniqueName: \"kubernetes.io/projected/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-kube-api-access-blcb6\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.809414 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804557dd-c3fc-4502-8b3f-4bcabfb93688-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "804557dd-c3fc-4502-8b3f-4bcabfb93688" (UID: "804557dd-c3fc-4502-8b3f-4bcabfb93688"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.809434 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6lqm\" (UniqueName: \"kubernetes.io/projected/9edc8ac7-41c7-4051-aca8-9fc79e516a2b-kube-api-access-q6lqm\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.809412 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf" (UID: "d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.809477 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd456\" (UniqueName: \"kubernetes.io/projected/eef8e93b-aef1-4d5b-a40d-eaad723384cf-kube-api-access-fd456\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.809502 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17ac7e3-0ac7-480c-9909-b4c3cc76696b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.812127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-kube-api-access-5lnnq" (OuterVolumeSpecName: "kube-api-access-5lnnq") pod "d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf" (UID: "d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf"). InnerVolumeSpecName "kube-api-access-5lnnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.812170 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804557dd-c3fc-4502-8b3f-4bcabfb93688-kube-api-access-sqkzx" (OuterVolumeSpecName: "kube-api-access-sqkzx") pod "804557dd-c3fc-4502-8b3f-4bcabfb93688" (UID: "804557dd-c3fc-4502-8b3f-4bcabfb93688"). InnerVolumeSpecName "kube-api-access-sqkzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.911489 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804557dd-c3fc-4502-8b3f-4bcabfb93688-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.911526 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.911542 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqkzx\" (UniqueName: \"kubernetes.io/projected/804557dd-c3fc-4502-8b3f-4bcabfb93688-kube-api-access-sqkzx\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.911556 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lnnq\" (UniqueName: \"kubernetes.io/projected/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf-kube-api-access-5lnnq\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.987335 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mfpdw" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.987319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mfpdw" event={"ID":"d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf","Type":"ContainerDied","Data":"dc85aa187427e6e6c8ff07ce89c27ddcb050540a500e69c98321b920da87563f"} Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.987479 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc85aa187427e6e6c8ff07ce89c27ddcb050540a500e69c98321b920da87563f" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.989325 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" event={"ID":"eef8e93b-aef1-4d5b-a40d-eaad723384cf","Type":"ContainerDied","Data":"9cb3d4320097ba11ddcbbae5b9d6ab0650e8cfa466c15a4f30d1a530d92ededa"} Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.989354 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-91d2-account-create-update-kmjpl" Mar 13 12:10:58 crc kubenswrapper[4786]: I0313 12:10:58.989370 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb3d4320097ba11ddcbbae5b9d6ab0650e8cfa466c15a4f30d1a530d92ededa" Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.005561 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.005568 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac9a-account-create-update-rbc5m" event={"ID":"9edc8ac7-41c7-4051-aca8-9fc79e516a2b","Type":"ContainerDied","Data":"b64633c3fbff2fcb165efd1fc7f709222dea24fcde1aaaab067c67da08abbea9"} Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.005606 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b64633c3fbff2fcb165efd1fc7f709222dea24fcde1aaaab067c67da08abbea9" Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.008685 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j474k" Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.008713 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j474k" event={"ID":"804557dd-c3fc-4502-8b3f-4bcabfb93688","Type":"ContainerDied","Data":"55d24c1e0479a7ddd2a5d8d757a757a484c9361a173909299d09d0f612b7fb75"} Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.009212 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55d24c1e0479a7ddd2a5d8d757a757a484c9361a173909299d09d0f612b7fb75" Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.012938 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerStarted","Data":"2292af607b7c7d4059f8d4cfc86f189cb77f23f83575ba992f5cbd085abaab12"} Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.020801 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q9frn" event={"ID":"a17ac7e3-0ac7-480c-9909-b4c3cc76696b","Type":"ContainerDied","Data":"b43c88529cf7ec6dcdd149a2965a2bcf854b2595120c4c4bd9f18057ae3255dc"} Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 
12:10:59.020846 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b43c88529cf7ec6dcdd149a2965a2bcf854b2595120c4c4bd9f18057ae3255dc" Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.021001 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q9frn" Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.025031 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.025095 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee17-account-create-update-5l6xl" event={"ID":"f70490bf-3f7e-4490-b045-dd095a1fdd16","Type":"ContainerDied","Data":"1e8c682100aaaa07ec488869519fc3a94c0fde7c2011bb9392376b52cad9d934"} Mar 13 12:10:59 crc kubenswrapper[4786]: I0313 12:10:59.025125 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8c682100aaaa07ec488869519fc3a94c0fde7c2011bb9392376b52cad9d934" Mar 13 12:11:00 crc kubenswrapper[4786]: I0313 12:11:00.034343 4786 generic.go:334] "Generic (PLEG): container finished" podID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerID="242cb13505de80d789eccfaca107cdef2f91c7dca24ac7ce426c66c1b256a47f" exitCode=0 Mar 13 12:11:00 crc kubenswrapper[4786]: I0313 12:11:00.034397 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0be8ed6-db24-42dd-8e7d-406ce46d2787","Type":"ContainerDied","Data":"242cb13505de80d789eccfaca107cdef2f91c7dca24ac7ce426c66c1b256a47f"} Mar 13 12:11:01 crc kubenswrapper[4786]: I0313 12:11:01.582438 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": dial 
tcp 10.217.0.153:9292: connect: connection refused" Mar 13 12:11:01 crc kubenswrapper[4786]: I0313 12:11:01.582551 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": dial tcp 10.217.0.153:9292: connect: connection refused" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.050807 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jqvv"] Mar 13 12:11:02 crc kubenswrapper[4786]: E0313 12:11:02.051360 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edc8ac7-41c7-4051-aca8-9fc79e516a2b" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051396 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edc8ac7-41c7-4051-aca8-9fc79e516a2b" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc kubenswrapper[4786]: E0313 12:11:02.051420 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f70490bf-3f7e-4490-b045-dd095a1fdd16" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051430 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f70490bf-3f7e-4490-b045-dd095a1fdd16" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc kubenswrapper[4786]: E0313 12:11:02.051450 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef8e93b-aef1-4d5b-a40d-eaad723384cf" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051461 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef8e93b-aef1-4d5b-a40d-eaad723384cf" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc kubenswrapper[4786]: E0313 12:11:02.051481 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051491 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: E0313 12:11:02.051532 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804557dd-c3fc-4502-8b3f-4bcabfb93688" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051543 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="804557dd-c3fc-4502-8b3f-4bcabfb93688" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: E0313 12:11:02.051563 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a17ac7e3-0ac7-480c-9909-b4c3cc76696b" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051574 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a17ac7e3-0ac7-480c-9909-b4c3cc76696b" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051842 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="804557dd-c3fc-4502-8b3f-4bcabfb93688" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051900 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051954 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f70490bf-3f7e-4490-b045-dd095a1fdd16" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.051977 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef8e93b-aef1-4d5b-a40d-eaad723384cf" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc 
kubenswrapper[4786]: I0313 12:11:02.051998 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a17ac7e3-0ac7-480c-9909-b4c3cc76696b" containerName="mariadb-database-create" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.052013 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edc8ac7-41c7-4051-aca8-9fc79e516a2b" containerName="mariadb-account-create-update" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.053023 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.055279 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.056765 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bx625" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.064643 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.065084 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jqvv"] Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.181249 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-scripts\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.181623 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.181653 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnbmv\" (UniqueName: \"kubernetes.io/projected/085820e1-a384-4656-8200-bb5ae71491ae-kube-api-access-pnbmv\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.181686 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-config-data\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.283083 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-scripts\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.283191 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.283220 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnbmv\" (UniqueName: 
\"kubernetes.io/projected/085820e1-a384-4656-8200-bb5ae71491ae-kube-api-access-pnbmv\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.283253 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-config-data\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.306742 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-config-data\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.307335 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.307577 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-scripts\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.327217 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnbmv\" (UniqueName: 
\"kubernetes.io/projected/085820e1-a384-4656-8200-bb5ae71491ae-kube-api-access-pnbmv\") pod \"nova-cell0-conductor-db-sync-4jqvv\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.373367 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.462015 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.598372 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-httpd-run\") pod \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.598440 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-logs\") pod \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.598486 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-config-data\") pod \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.598523 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-scripts\") pod \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " Mar 13 12:11:02 crc 
kubenswrapper[4786]: I0313 12:11:02.598586 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.598618 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-public-tls-certs\") pod \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.598703 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cbt5\" (UniqueName: \"kubernetes.io/projected/e0be8ed6-db24-42dd-8e7d-406ce46d2787-kube-api-access-4cbt5\") pod \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.598874 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-combined-ca-bundle\") pod \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\" (UID: \"e0be8ed6-db24-42dd-8e7d-406ce46d2787\") " Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.599076 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-logs" (OuterVolumeSpecName: "logs") pod "e0be8ed6-db24-42dd-8e7d-406ce46d2787" (UID: "e0be8ed6-db24-42dd-8e7d-406ce46d2787"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.599374 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.600210 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e0be8ed6-db24-42dd-8e7d-406ce46d2787" (UID: "e0be8ed6-db24-42dd-8e7d-406ce46d2787"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.604965 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-scripts" (OuterVolumeSpecName: "scripts") pod "e0be8ed6-db24-42dd-8e7d-406ce46d2787" (UID: "e0be8ed6-db24-42dd-8e7d-406ce46d2787"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.611482 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0be8ed6-db24-42dd-8e7d-406ce46d2787-kube-api-access-4cbt5" (OuterVolumeSpecName: "kube-api-access-4cbt5") pod "e0be8ed6-db24-42dd-8e7d-406ce46d2787" (UID: "e0be8ed6-db24-42dd-8e7d-406ce46d2787"). InnerVolumeSpecName "kube-api-access-4cbt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.612141 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "e0be8ed6-db24-42dd-8e7d-406ce46d2787" (UID: "e0be8ed6-db24-42dd-8e7d-406ce46d2787"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.636922 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0be8ed6-db24-42dd-8e7d-406ce46d2787" (UID: "e0be8ed6-db24-42dd-8e7d-406ce46d2787"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.667661 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0be8ed6-db24-42dd-8e7d-406ce46d2787" (UID: "e0be8ed6-db24-42dd-8e7d-406ce46d2787"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.683544 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-config-data" (OuterVolumeSpecName: "config-data") pod "e0be8ed6-db24-42dd-8e7d-406ce46d2787" (UID: "e0be8ed6-db24-42dd-8e7d-406ce46d2787"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.704544 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.704604 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.705597 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cbt5\" (UniqueName: \"kubernetes.io/projected/e0be8ed6-db24-42dd-8e7d-406ce46d2787-kube-api-access-4cbt5\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.705629 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.705642 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0be8ed6-db24-42dd-8e7d-406ce46d2787-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.705653 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.705722 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0be8ed6-db24-42dd-8e7d-406ce46d2787-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.725786 4786 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.807514 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.836433 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.839043 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:11:02 crc kubenswrapper[4786]: I0313 12:11:02.890668 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jqvv"] Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.070055 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerStarted","Data":"cff7926761705c50437386792d09ade1b7a9bd832aed9b0dd39ec5cd5d4a827d"} Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.070335 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.073891 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0be8ed6-db24-42dd-8e7d-406ce46d2787","Type":"ContainerDied","Data":"75f9028b0da1b0c4965f25053bfeefd13e903ecdaf4334db6c4cbfc098a2559c"} Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.074000 4786 scope.go:117] "RemoveContainer" containerID="242cb13505de80d789eccfaca107cdef2f91c7dca24ac7ce426c66c1b256a47f" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.073926 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.077420 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" event={"ID":"085820e1-a384-4656-8200-bb5ae71491ae","Type":"ContainerStarted","Data":"d652f5dd5f904d419a3f74bc1be78fecfc5d2e49874a87bc9625dad4b919e005"} Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.101099 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9497554419999998 podStartE2EDuration="8.101079439s" podCreationTimestamp="2026-03-13 12:10:55 +0000 UTC" firstStartedPulling="2026-03-13 12:10:55.940537908 +0000 UTC m=+1443.220191355" lastFinishedPulling="2026-03-13 12:11:02.091861905 +0000 UTC m=+1449.371515352" observedRunningTime="2026-03-13 12:11:03.092868276 +0000 UTC m=+1450.372521743" watchObservedRunningTime="2026-03-13 12:11:03.101079439 +0000 UTC m=+1450.380732906" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.118560 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.121809 4786 scope.go:117] "RemoveContainer" containerID="4386657e12cfec63465a3b15404c56ebc92434156273aa41cc0e3fb89d3392fd" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.130122 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.165914 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:11:03 crc kubenswrapper[4786]: E0313 12:11:03.166310 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-httpd" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.166329 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-httpd" Mar 13 12:11:03 crc kubenswrapper[4786]: E0313 12:11:03.166353 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-log" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.166361 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-log" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.166510 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-httpd" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.166530 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" containerName="glance-log" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.167423 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.173437 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.173673 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.185295 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.326317 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 
12:11:03.326359 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.326384 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-logs\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.326418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.326441 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s749p\" (UniqueName: \"kubernetes.io/projected/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-kube-api-access-s749p\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.326486 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc 
kubenswrapper[4786]: I0313 12:11:03.326501 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.326569 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.428571 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.428618 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s749p\" (UniqueName: \"kubernetes.io/projected/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-kube-api-access-s749p\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.428650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 
12:11:03.428668 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.428733 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.428794 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.428810 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.428829 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-logs\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.429245 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-logs\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.430393 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.430734 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.435124 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.435678 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.435709 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.437579 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.452294 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0be8ed6-db24-42dd-8e7d-406ce46d2787" path="/var/lib/kubelet/pods/e0be8ed6-db24-42dd-8e7d-406ce46d2787/volumes" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.458735 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s749p\" (UniqueName: \"kubernetes.io/projected/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-kube-api-access-s749p\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.459720 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " pod="openstack/glance-default-external-api-0" Mar 13 12:11:03 crc kubenswrapper[4786]: I0313 12:11:03.513579 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:11:04 crc kubenswrapper[4786]: I0313 12:11:04.110051 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:11:04 crc kubenswrapper[4786]: W0313 12:11:04.110858 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab23f85_03a5_4df3_bfa8_da6f748f44e3.slice/crio-db587533ffe68bfd326df85ca0eeb44b7da6d8e17ff5b9a8fb4b15d34c97a2bf WatchSource:0}: Error finding container db587533ffe68bfd326df85ca0eeb44b7da6d8e17ff5b9a8fb4b15d34c97a2bf: Status 404 returned error can't find the container with id db587533ffe68bfd326df85ca0eeb44b7da6d8e17ff5b9a8fb4b15d34c97a2bf Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.122223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ab23f85-03a5-4df3-bfa8-da6f748f44e3","Type":"ContainerStarted","Data":"b9aec14b391a1bbbd8f466a3df6625873e9ec6de58fe63728da3a16855652999"} Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.122609 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ab23f85-03a5-4df3-bfa8-da6f748f44e3","Type":"ContainerStarted","Data":"db587533ffe68bfd326df85ca0eeb44b7da6d8e17ff5b9a8fb4b15d34c97a2bf"} Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.575993 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.639620 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6695497cb-f75lw"] Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.644181 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6695497cb-f75lw" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerName="neutron-httpd" 
containerID="cri-o://fd9f41b8b3315b8b8e67a93d2da46fd1cc3e421939704c02fca39b112c8be3ad" gracePeriod=30 Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.644342 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6695497cb-f75lw" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerName="neutron-api" containerID="cri-o://ba1170c5d3d38880fad622163eb1a2c2fe21ed2114dd24e5037faabbbe295718" gracePeriod=30 Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.667179 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.667225 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.695270 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:11:05 crc kubenswrapper[4786]: I0313 12:11:05.736756 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:11:06 crc kubenswrapper[4786]: I0313 12:11:06.136900 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ab23f85-03a5-4df3-bfa8-da6f748f44e3","Type":"ContainerStarted","Data":"2a0a684c9b4c3a0e217496dbf85c36bb9e3e0ef4e9a768baeea5f590927cf39d"} Mar 13 12:11:06 crc kubenswrapper[4786]: I0313 12:11:06.152709 4786 generic.go:334] "Generic (PLEG): container finished" podID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerID="fd9f41b8b3315b8b8e67a93d2da46fd1cc3e421939704c02fca39b112c8be3ad" exitCode=0 Mar 13 12:11:06 crc kubenswrapper[4786]: I0313 12:11:06.152849 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6695497cb-f75lw" 
event={"ID":"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e","Type":"ContainerDied","Data":"fd9f41b8b3315b8b8e67a93d2da46fd1cc3e421939704c02fca39b112c8be3ad"} Mar 13 12:11:06 crc kubenswrapper[4786]: I0313 12:11:06.153261 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:11:06 crc kubenswrapper[4786]: I0313 12:11:06.153280 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:11:06 crc kubenswrapper[4786]: I0313 12:11:06.173559 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.173535616 podStartE2EDuration="3.173535616s" podCreationTimestamp="2026-03-13 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:06.15965867 +0000 UTC m=+1453.439312137" watchObservedRunningTime="2026-03-13 12:11:06.173535616 +0000 UTC m=+1453.453189073" Mar 13 12:11:07 crc kubenswrapper[4786]: I0313 12:11:07.371308 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:07 crc kubenswrapper[4786]: I0313 12:11:07.371630 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="ceilometer-central-agent" containerID="cri-o://8010a78b01afc195a76853ded69c3026c8b76f3fb54af586bf26d2ed707f8369" gracePeriod=30 Mar 13 12:11:07 crc kubenswrapper[4786]: I0313 12:11:07.371986 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="proxy-httpd" containerID="cri-o://cff7926761705c50437386792d09ade1b7a9bd832aed9b0dd39ec5cd5d4a827d" gracePeriod=30 Mar 13 12:11:07 crc kubenswrapper[4786]: I0313 12:11:07.371986 4786 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="ceilometer-notification-agent" containerID="cri-o://92113799eaee49b183a86742bf6e256427a3316668d3201346069a15e93dadff" gracePeriod=30 Mar 13 12:11:07 crc kubenswrapper[4786]: I0313 12:11:07.371989 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="sg-core" containerID="cri-o://2292af607b7c7d4059f8d4cfc86f189cb77f23f83575ba992f5cbd085abaab12" gracePeriod=30 Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.169040 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.169098 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.180123 4786 generic.go:334] "Generic (PLEG): container finished" podID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerID="cff7926761705c50437386792d09ade1b7a9bd832aed9b0dd39ec5cd5d4a827d" exitCode=0 Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.180166 4786 generic.go:334] "Generic (PLEG): container finished" podID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerID="2292af607b7c7d4059f8d4cfc86f189cb77f23f83575ba992f5cbd085abaab12" exitCode=2 Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.180178 4786 generic.go:334] "Generic (PLEG): 
container finished" podID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerID="92113799eaee49b183a86742bf6e256427a3316668d3201346069a15e93dadff" exitCode=0 Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.180189 4786 generic.go:334] "Generic (PLEG): container finished" podID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerID="8010a78b01afc195a76853ded69c3026c8b76f3fb54af586bf26d2ed707f8369" exitCode=0 Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.180214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerDied","Data":"cff7926761705c50437386792d09ade1b7a9bd832aed9b0dd39ec5cd5d4a827d"} Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.180264 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerDied","Data":"2292af607b7c7d4059f8d4cfc86f189cb77f23f83575ba992f5cbd085abaab12"} Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.180274 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerDied","Data":"92113799eaee49b183a86742bf6e256427a3316668d3201346069a15e93dadff"} Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.180283 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerDied","Data":"8010a78b01afc195a76853ded69c3026c8b76f3fb54af586bf26d2ed707f8369"} Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.235797 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.235913 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:11:08 crc kubenswrapper[4786]: I0313 12:11:08.252256 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:11:10 crc kubenswrapper[4786]: I0313 12:11:10.622276 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:11:10 crc kubenswrapper[4786]: I0313 12:11:10.669364 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:11:10 crc kubenswrapper[4786]: I0313 12:11:10.741360 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7d4596df6b-5xl5b"] Mar 13 12:11:10 crc kubenswrapper[4786]: I0313 12:11:10.741672 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7d4596df6b-5xl5b" podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerName="placement-log" containerID="cri-o://75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873" gracePeriod=30 Mar 13 12:11:10 crc kubenswrapper[4786]: I0313 12:11:10.742188 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7d4596df6b-5xl5b" podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerName="placement-api" containerID="cri-o://a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c" gracePeriod=30 Mar 13 12:11:11 crc kubenswrapper[4786]: I0313 12:11:11.230855 4786 generic.go:334] "Generic (PLEG): container finished" podID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerID="75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873" exitCode=143 Mar 13 12:11:11 crc kubenswrapper[4786]: I0313 12:11:11.230919 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d4596df6b-5xl5b" event={"ID":"b56e5879-af0d-47cc-8ce9-0bc5437c77f3","Type":"ContainerDied","Data":"75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873"} Mar 13 12:11:12 crc kubenswrapper[4786]: I0313 12:11:12.248334 4786 
generic.go:334] "Generic (PLEG): container finished" podID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerID="ba1170c5d3d38880fad622163eb1a2c2fe21ed2114dd24e5037faabbbe295718" exitCode=0 Mar 13 12:11:12 crc kubenswrapper[4786]: I0313 12:11:12.248387 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6695497cb-f75lw" event={"ID":"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e","Type":"ContainerDied","Data":"ba1170c5d3d38880fad622163eb1a2c2fe21ed2114dd24e5037faabbbe295718"} Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.125217 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.138920 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.155223 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-ovndb-tls-certs\") pod \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.155286 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-scripts\") pod \"ad4aa260-7f29-4b03-aa53-f927a39b2370\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.155321 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-run-httpd\") pod \"ad4aa260-7f29-4b03-aa53-f927a39b2370\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.155348 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-combined-ca-bundle\") pod \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.156161 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ad4aa260-7f29-4b03-aa53-f927a39b2370" (UID: "ad4aa260-7f29-4b03-aa53-f927a39b2370"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.157350 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsfpv\" (UniqueName: \"kubernetes.io/projected/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-kube-api-access-nsfpv\") pod \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.157544 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9ntl\" (UniqueName: \"kubernetes.io/projected/ad4aa260-7f29-4b03-aa53-f927a39b2370-kube-api-access-j9ntl\") pod \"ad4aa260-7f29-4b03-aa53-f927a39b2370\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.157605 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-config-data\") pod \"ad4aa260-7f29-4b03-aa53-f927a39b2370\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.157658 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-combined-ca-bundle\") pod \"ad4aa260-7f29-4b03-aa53-f927a39b2370\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.157719 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-log-httpd\") pod \"ad4aa260-7f29-4b03-aa53-f927a39b2370\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.157744 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-httpd-config\") pod \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.157768 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-config\") pod \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\" (UID: \"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.157824 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-sg-core-conf-yaml\") pod \"ad4aa260-7f29-4b03-aa53-f927a39b2370\" (UID: \"ad4aa260-7f29-4b03-aa53-f927a39b2370\") " Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.159193 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.160470 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ad4aa260-7f29-4b03-aa53-f927a39b2370" (UID: "ad4aa260-7f29-4b03-aa53-f927a39b2370"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.182815 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" (UID: "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.183152 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-scripts" (OuterVolumeSpecName: "scripts") pod "ad4aa260-7f29-4b03-aa53-f927a39b2370" (UID: "ad4aa260-7f29-4b03-aa53-f927a39b2370"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.184084 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-kube-api-access-nsfpv" (OuterVolumeSpecName: "kube-api-access-nsfpv") pod "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" (UID: "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e"). InnerVolumeSpecName "kube-api-access-nsfpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.185168 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4aa260-7f29-4b03-aa53-f927a39b2370-kube-api-access-j9ntl" (OuterVolumeSpecName: "kube-api-access-j9ntl") pod "ad4aa260-7f29-4b03-aa53-f927a39b2370" (UID: "ad4aa260-7f29-4b03-aa53-f927a39b2370"). 
InnerVolumeSpecName "kube-api-access-j9ntl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.231034 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-config" (OuterVolumeSpecName: "config") pod "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" (UID: "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.242117 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ad4aa260-7f29-4b03-aa53-f927a39b2370" (UID: "ad4aa260-7f29-4b03-aa53-f927a39b2370"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.261745 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.262444 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.262536 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsfpv\" (UniqueName: \"kubernetes.io/projected/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-kube-api-access-nsfpv\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.262610 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9ntl\" (UniqueName: 
\"kubernetes.io/projected/ad4aa260-7f29-4b03-aa53-f927a39b2370-kube-api-access-j9ntl\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.262685 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4aa260-7f29-4b03-aa53-f927a39b2370-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.262756 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.262831 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.269408 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" (UID: "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.271511 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6695497cb-f75lw" event={"ID":"cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e","Type":"ContainerDied","Data":"2202dbfe91b91a147eca2995f53ab20ddf28c423277a61fcbcdcdccdd586bc99"} Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.271570 4786 scope.go:117] "RemoveContainer" containerID="fd9f41b8b3315b8b8e67a93d2da46fd1cc3e421939704c02fca39b112c8be3ad" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.271744 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6695497cb-f75lw" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.280613 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad4aa260-7f29-4b03-aa53-f927a39b2370" (UID: "ad4aa260-7f29-4b03-aa53-f927a39b2370"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.287211 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad4aa260-7f29-4b03-aa53-f927a39b2370","Type":"ContainerDied","Data":"571b8e62f128e12cacfbb6506c46a1855fc0b106379ac2848426d73a050b89f9"} Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.287220 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.293830 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" (UID: "cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.318398 4786 scope.go:117] "RemoveContainer" containerID="ba1170c5d3d38880fad622163eb1a2c2fe21ed2114dd24e5037faabbbe295718" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.322062 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-config-data" (OuterVolumeSpecName: "config-data") pod "ad4aa260-7f29-4b03-aa53-f927a39b2370" (UID: "ad4aa260-7f29-4b03-aa53-f927a39b2370"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.338802 4786 scope.go:117] "RemoveContainer" containerID="cff7926761705c50437386792d09ade1b7a9bd832aed9b0dd39ec5cd5d4a827d" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.356807 4786 scope.go:117] "RemoveContainer" containerID="2292af607b7c7d4059f8d4cfc86f189cb77f23f83575ba992f5cbd085abaab12" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.364180 4786 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.364208 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.364218 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.364228 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4aa260-7f29-4b03-aa53-f927a39b2370-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.438860 4786 scope.go:117] "RemoveContainer" containerID="92113799eaee49b183a86742bf6e256427a3316668d3201346069a15e93dadff" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.459822 4786 scope.go:117] "RemoveContainer" containerID="8010a78b01afc195a76853ded69c3026c8b76f3fb54af586bf26d2ed707f8369" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.514058 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.514112 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.542486 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.550954 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.617571 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6695497cb-f75lw"] Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.625324 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6695497cb-f75lw"] Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.641543 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.650045 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663065 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:13 crc kubenswrapper[4786]: E0313 12:11:13.663406 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerName="neutron-api" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663419 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerName="neutron-api" Mar 13 12:11:13 crc kubenswrapper[4786]: E0313 12:11:13.663439 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="ceilometer-central-agent" Mar 13 12:11:13 crc kubenswrapper[4786]: 
I0313 12:11:13.663446 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="ceilometer-central-agent" Mar 13 12:11:13 crc kubenswrapper[4786]: E0313 12:11:13.663455 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="proxy-httpd" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663461 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="proxy-httpd" Mar 13 12:11:13 crc kubenswrapper[4786]: E0313 12:11:13.663472 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerName="neutron-httpd" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663479 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerName="neutron-httpd" Mar 13 12:11:13 crc kubenswrapper[4786]: E0313 12:11:13.663496 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="sg-core" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663502 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="sg-core" Mar 13 12:11:13 crc kubenswrapper[4786]: E0313 12:11:13.663515 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="ceilometer-notification-agent" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663520 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="ceilometer-notification-agent" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663684 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerName="neutron-api" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 
12:11:13.663697 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="ceilometer-notification-agent" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663710 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="ceilometer-central-agent" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663721 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" containerName="neutron-httpd" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663731 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="sg-core" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.663744 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" containerName="proxy-httpd" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.665346 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.667949 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.673188 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.676402 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.770358 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-log-httpd\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.770427 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-config-data\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.770592 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-run-httpd\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.770723 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652fc\" (UniqueName: \"kubernetes.io/projected/2b5f7365-eee1-4011-8580-8aed69f6a457-kube-api-access-652fc\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " 
pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.770764 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.770832 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.771001 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-scripts\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.872899 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-log-httpd\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.872960 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-config-data\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.872999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-run-httpd\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.873035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-652fc\" (UniqueName: \"kubernetes.io/projected/2b5f7365-eee1-4011-8580-8aed69f6a457-kube-api-access-652fc\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.873057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.873087 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.873158 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-scripts\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.873352 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-log-httpd\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc 
kubenswrapper[4786]: I0313 12:11:13.873443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-run-httpd\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.878137 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.879239 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.880969 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-config-data\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.885609 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-scripts\") pod \"ceilometer-0\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.899750 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-652fc\" (UniqueName: \"kubernetes.io/projected/2b5f7365-eee1-4011-8580-8aed69f6a457-kube-api-access-652fc\") pod \"ceilometer-0\" (UID: 
\"2b5f7365-eee1-4011-8580-8aed69f6a457\") " pod="openstack/ceilometer-0" Mar 13 12:11:13 crc kubenswrapper[4786]: I0313 12:11:13.982653 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.246846 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7d4596df6b-5xl5b" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.279773 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-scripts\") pod \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.279844 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-public-tls-certs\") pod \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.279937 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5prl\" (UniqueName: \"kubernetes.io/projected/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-kube-api-access-r5prl\") pod \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.279964 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-config-data\") pod \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.280007 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-combined-ca-bundle\") pod \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.280102 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-logs\") pod \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.280133 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-internal-tls-certs\") pod \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\" (UID: \"b56e5879-af0d-47cc-8ce9-0bc5437c77f3\") " Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.283756 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-logs" (OuterVolumeSpecName: "logs") pod "b56e5879-af0d-47cc-8ce9-0bc5437c77f3" (UID: "b56e5879-af0d-47cc-8ce9-0bc5437c77f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.290552 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-scripts" (OuterVolumeSpecName: "scripts") pod "b56e5879-af0d-47cc-8ce9-0bc5437c77f3" (UID: "b56e5879-af0d-47cc-8ce9-0bc5437c77f3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.296345 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-kube-api-access-r5prl" (OuterVolumeSpecName: "kube-api-access-r5prl") pod "b56e5879-af0d-47cc-8ce9-0bc5437c77f3" (UID: "b56e5879-af0d-47cc-8ce9-0bc5437c77f3"). InnerVolumeSpecName "kube-api-access-r5prl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.302991 4786 generic.go:334] "Generic (PLEG): container finished" podID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerID="a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c" exitCode=0 Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.303067 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d4596df6b-5xl5b" event={"ID":"b56e5879-af0d-47cc-8ce9-0bc5437c77f3","Type":"ContainerDied","Data":"a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c"} Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.303098 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d4596df6b-5xl5b" event={"ID":"b56e5879-af0d-47cc-8ce9-0bc5437c77f3","Type":"ContainerDied","Data":"481d4fcf9dc56ab905bca2c38cecf0e8b2c75a82907616098dbdb54976b11a8b"} Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.303119 4786 scope.go:117] "RemoveContainer" containerID="a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.303241 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7d4596df6b-5xl5b" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.314847 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" event={"ID":"085820e1-a384-4656-8200-bb5ae71491ae","Type":"ContainerStarted","Data":"a4508a806bab3c42762b33d4b026a3718e0e71d0df4154080451f18edf45ef82"} Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.323972 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.324244 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.342158 4786 scope.go:117] "RemoveContainer" containerID="75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.356474 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b56e5879-af0d-47cc-8ce9-0bc5437c77f3" (UID: "b56e5879-af0d-47cc-8ce9-0bc5437c77f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.357844 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" podStartSLOduration=2.083168666 podStartE2EDuration="12.357829106s" podCreationTimestamp="2026-03-13 12:11:02 +0000 UTC" firstStartedPulling="2026-03-13 12:11:02.898547623 +0000 UTC m=+1450.178201070" lastFinishedPulling="2026-03-13 12:11:13.173208063 +0000 UTC m=+1460.452861510" observedRunningTime="2026-03-13 12:11:14.33956102 +0000 UTC m=+1461.619214487" watchObservedRunningTime="2026-03-13 12:11:14.357829106 +0000 UTC m=+1461.637482553" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.364713 4786 scope.go:117] "RemoveContainer" containerID="a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c" Mar 13 12:11:14 crc kubenswrapper[4786]: E0313 12:11:14.365310 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c\": container with ID starting with a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c not found: ID does not exist" containerID="a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.365352 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c"} err="failed to get container status \"a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c\": rpc error: code = NotFound desc = could not find container \"a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c\": container with ID starting with a4111bac2b27d94c0c61835076d95f077118bc6fc3fb050e958bda1c81da4d2c not found: ID does not exist" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.365377 4786 
scope.go:117] "RemoveContainer" containerID="75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873" Mar 13 12:11:14 crc kubenswrapper[4786]: E0313 12:11:14.365765 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873\": container with ID starting with 75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873 not found: ID does not exist" containerID="75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.365934 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873"} err="failed to get container status \"75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873\": rpc error: code = NotFound desc = could not find container \"75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873\": container with ID starting with 75a38ebe2cf2af26292cf31fa8e72e3a1a19dfb2027a28f75d371e27c2fd0873 not found: ID does not exist" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.387065 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-config-data" (OuterVolumeSpecName: "config-data") pod "b56e5879-af0d-47cc-8ce9-0bc5437c77f3" (UID: "b56e5879-af0d-47cc-8ce9-0bc5437c77f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.388834 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5prl\" (UniqueName: \"kubernetes.io/projected/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-kube-api-access-r5prl\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.388872 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.388907 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.388919 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.388932 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.448337 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b56e5879-af0d-47cc-8ce9-0bc5437c77f3" (UID: "b56e5879-af0d-47cc-8ce9-0bc5437c77f3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.458996 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b56e5879-af0d-47cc-8ce9-0bc5437c77f3" (UID: "b56e5879-af0d-47cc-8ce9-0bc5437c77f3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.479154 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.495345 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.495386 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56e5879-af0d-47cc-8ce9-0bc5437c77f3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.639419 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7d4596df6b-5xl5b"] Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.649453 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7d4596df6b-5xl5b"] Mar 13 12:11:14 crc kubenswrapper[4786]: I0313 12:11:14.879739 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:15 crc kubenswrapper[4786]: I0313 12:11:15.332053 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerStarted","Data":"b5d51382073a9ddcbeae9789fb275da029227523f126671bdc23a3dd3e89eac0"} Mar 13 12:11:15 crc kubenswrapper[4786]: 
I0313 12:11:15.333520 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerStarted","Data":"7f469fb4fe646a510979c6bf5cc1c54e57d89d9bd46b255eefd7481682b4dd47"} Mar 13 12:11:15 crc kubenswrapper[4786]: I0313 12:11:15.452579 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4aa260-7f29-4b03-aa53-f927a39b2370" path="/var/lib/kubelet/pods/ad4aa260-7f29-4b03-aa53-f927a39b2370/volumes" Mar 13 12:11:15 crc kubenswrapper[4786]: I0313 12:11:15.453621 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" path="/var/lib/kubelet/pods/b56e5879-af0d-47cc-8ce9-0bc5437c77f3/volumes" Mar 13 12:11:15 crc kubenswrapper[4786]: I0313 12:11:15.454354 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e" path="/var/lib/kubelet/pods/cd9ff8d2-f59a-430a-93dd-ad6df3ad0a8e/volumes" Mar 13 12:11:16 crc kubenswrapper[4786]: I0313 12:11:16.325300 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:11:16 crc kubenswrapper[4786]: I0313 12:11:16.344956 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:11:16 crc kubenswrapper[4786]: I0313 12:11:16.386107 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerStarted","Data":"0ac793b16473abff4126e0889f2b7929ca4fca314094cc8a41f73c19fbe0314c"} Mar 13 12:11:17 crc kubenswrapper[4786]: I0313 12:11:17.394773 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerStarted","Data":"2b0f393b507180cb42858e60aba608f5adfb8a858cae057e57251cb75f11a6ec"} Mar 13 12:11:18 crc 
kubenswrapper[4786]: I0313 12:11:18.407755 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerStarted","Data":"925f7968dfead0196e87a2fbaeb59ac9773886541441c662666b1e4b66cada73"} Mar 13 12:11:19 crc kubenswrapper[4786]: I0313 12:11:19.415839 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="ceilometer-central-agent" containerID="cri-o://b5d51382073a9ddcbeae9789fb275da029227523f126671bdc23a3dd3e89eac0" gracePeriod=30 Mar 13 12:11:19 crc kubenswrapper[4786]: I0313 12:11:19.415905 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:11:19 crc kubenswrapper[4786]: I0313 12:11:19.415931 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="ceilometer-notification-agent" containerID="cri-o://0ac793b16473abff4126e0889f2b7929ca4fca314094cc8a41f73c19fbe0314c" gracePeriod=30 Mar 13 12:11:19 crc kubenswrapper[4786]: I0313 12:11:19.415939 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="sg-core" containerID="cri-o://2b0f393b507180cb42858e60aba608f5adfb8a858cae057e57251cb75f11a6ec" gracePeriod=30 Mar 13 12:11:19 crc kubenswrapper[4786]: I0313 12:11:19.415954 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="proxy-httpd" containerID="cri-o://925f7968dfead0196e87a2fbaeb59ac9773886541441c662666b1e4b66cada73" gracePeriod=30 Mar 13 12:11:19 crc kubenswrapper[4786]: I0313 12:11:19.442477 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.778165845 podStartE2EDuration="6.44245523s" podCreationTimestamp="2026-03-13 12:11:13 +0000 UTC" firstStartedPulling="2026-03-13 12:11:14.479017084 +0000 UTC m=+1461.758670531" lastFinishedPulling="2026-03-13 12:11:18.143306469 +0000 UTC m=+1465.422959916" observedRunningTime="2026-03-13 12:11:19.442286195 +0000 UTC m=+1466.721939642" watchObservedRunningTime="2026-03-13 12:11:19.44245523 +0000 UTC m=+1466.722108727" Mar 13 12:11:20 crc kubenswrapper[4786]: I0313 12:11:20.438010 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerID="925f7968dfead0196e87a2fbaeb59ac9773886541441c662666b1e4b66cada73" exitCode=0 Mar 13 12:11:20 crc kubenswrapper[4786]: I0313 12:11:20.438057 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerID="2b0f393b507180cb42858e60aba608f5adfb8a858cae057e57251cb75f11a6ec" exitCode=2 Mar 13 12:11:20 crc kubenswrapper[4786]: I0313 12:11:20.438072 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerID="0ac793b16473abff4126e0889f2b7929ca4fca314094cc8a41f73c19fbe0314c" exitCode=0 Mar 13 12:11:20 crc kubenswrapper[4786]: I0313 12:11:20.438118 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerDied","Data":"925f7968dfead0196e87a2fbaeb59ac9773886541441c662666b1e4b66cada73"} Mar 13 12:11:20 crc kubenswrapper[4786]: I0313 12:11:20.438155 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerDied","Data":"2b0f393b507180cb42858e60aba608f5adfb8a858cae057e57251cb75f11a6ec"} Mar 13 12:11:20 crc kubenswrapper[4786]: I0313 12:11:20.438174 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerDied","Data":"0ac793b16473abff4126e0889f2b7929ca4fca314094cc8a41f73c19fbe0314c"} Mar 13 12:11:25 crc kubenswrapper[4786]: I0313 12:11:25.498196 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerID="b5d51382073a9ddcbeae9789fb275da029227523f126671bdc23a3dd3e89eac0" exitCode=0 Mar 13 12:11:25 crc kubenswrapper[4786]: I0313 12:11:25.498266 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerDied","Data":"b5d51382073a9ddcbeae9789fb275da029227523f126671bdc23a3dd3e89eac0"} Mar 13 12:11:25 crc kubenswrapper[4786]: I0313 12:11:25.835513 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.030121 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-scripts\") pod \"2b5f7365-eee1-4011-8580-8aed69f6a457\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.031234 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-log-httpd\") pod \"2b5f7365-eee1-4011-8580-8aed69f6a457\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.031266 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-652fc\" (UniqueName: \"kubernetes.io/projected/2b5f7365-eee1-4011-8580-8aed69f6a457-kube-api-access-652fc\") pod \"2b5f7365-eee1-4011-8580-8aed69f6a457\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.031317 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-combined-ca-bundle\") pod \"2b5f7365-eee1-4011-8580-8aed69f6a457\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.031429 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-config-data\") pod \"2b5f7365-eee1-4011-8580-8aed69f6a457\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.031466 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-sg-core-conf-yaml\") pod \"2b5f7365-eee1-4011-8580-8aed69f6a457\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.031579 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-run-httpd\") pod \"2b5f7365-eee1-4011-8580-8aed69f6a457\" (UID: \"2b5f7365-eee1-4011-8580-8aed69f6a457\") " Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.031670 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2b5f7365-eee1-4011-8580-8aed69f6a457" (UID: "2b5f7365-eee1-4011-8580-8aed69f6a457"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.032026 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.032041 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2b5f7365-eee1-4011-8580-8aed69f6a457" (UID: "2b5f7365-eee1-4011-8580-8aed69f6a457"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.037312 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b5f7365-eee1-4011-8580-8aed69f6a457-kube-api-access-652fc" (OuterVolumeSpecName: "kube-api-access-652fc") pod "2b5f7365-eee1-4011-8580-8aed69f6a457" (UID: "2b5f7365-eee1-4011-8580-8aed69f6a457"). InnerVolumeSpecName "kube-api-access-652fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.038479 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-scripts" (OuterVolumeSpecName: "scripts") pod "2b5f7365-eee1-4011-8580-8aed69f6a457" (UID: "2b5f7365-eee1-4011-8580-8aed69f6a457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.063817 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2b5f7365-eee1-4011-8580-8aed69f6a457" (UID: "2b5f7365-eee1-4011-8580-8aed69f6a457"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.127269 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b5f7365-eee1-4011-8580-8aed69f6a457" (UID: "2b5f7365-eee1-4011-8580-8aed69f6a457"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.127957 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-config-data" (OuterVolumeSpecName: "config-data") pod "2b5f7365-eee1-4011-8580-8aed69f6a457" (UID: "2b5f7365-eee1-4011-8580-8aed69f6a457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.133188 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.133208 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-652fc\" (UniqueName: \"kubernetes.io/projected/2b5f7365-eee1-4011-8580-8aed69f6a457-kube-api-access-652fc\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.133219 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.133228 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.133236 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b5f7365-eee1-4011-8580-8aed69f6a457-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.133244 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b5f7365-eee1-4011-8580-8aed69f6a457-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.509956 4786 generic.go:334] "Generic (PLEG): container finished" podID="085820e1-a384-4656-8200-bb5ae71491ae" containerID="a4508a806bab3c42762b33d4b026a3718e0e71d0df4154080451f18edf45ef82" exitCode=0 Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.510022 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" event={"ID":"085820e1-a384-4656-8200-bb5ae71491ae","Type":"ContainerDied","Data":"a4508a806bab3c42762b33d4b026a3718e0e71d0df4154080451f18edf45ef82"} Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.513829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b5f7365-eee1-4011-8580-8aed69f6a457","Type":"ContainerDied","Data":"7f469fb4fe646a510979c6bf5cc1c54e57d89d9bd46b255eefd7481682b4dd47"} Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.513874 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.513936 4786 scope.go:117] "RemoveContainer" containerID="925f7968dfead0196e87a2fbaeb59ac9773886541441c662666b1e4b66cada73" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.546713 4786 scope.go:117] "RemoveContainer" containerID="2b0f393b507180cb42858e60aba608f5adfb8a858cae057e57251cb75f11a6ec" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.552276 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.559759 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.567553 4786 scope.go:117] "RemoveContainer" containerID="0ac793b16473abff4126e0889f2b7929ca4fca314094cc8a41f73c19fbe0314c" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.573049 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:26 crc kubenswrapper[4786]: E0313 12:11:26.573759 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="sg-core" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.573864 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="sg-core" Mar 13 12:11:26 crc kubenswrapper[4786]: E0313 12:11:26.573972 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="ceilometer-notification-agent" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.574048 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="ceilometer-notification-agent" Mar 13 12:11:26 crc kubenswrapper[4786]: E0313 12:11:26.574141 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="ceilometer-central-agent" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.574205 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="ceilometer-central-agent" Mar 13 12:11:26 crc kubenswrapper[4786]: E0313 12:11:26.574275 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="proxy-httpd" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.574347 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="proxy-httpd" Mar 13 12:11:26 crc kubenswrapper[4786]: E0313 12:11:26.574435 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerName="placement-log" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.574518 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerName="placement-log" Mar 13 12:11:26 crc kubenswrapper[4786]: E0313 12:11:26.574629 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerName="placement-api" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.574707 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerName="placement-api" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.574991 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="sg-core" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.575079 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerName="placement-api" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.575180 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b56e5879-af0d-47cc-8ce9-0bc5437c77f3" containerName="placement-log" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.575558 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="ceilometer-notification-agent" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.575632 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="ceilometer-central-agent" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.575703 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" containerName="proxy-httpd" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.578114 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.581184 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.588263 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.601118 4786 scope.go:117] "RemoveContainer" containerID="b5d51382073a9ddcbeae9789fb275da029227523f126671bdc23a3dd3e89eac0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.637767 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.744677 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-run-httpd\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.744769 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.744787 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqmqj\" (UniqueName: \"kubernetes.io/projected/5b8e8868-a319-487c-b1d8-8070f652b3cf-kube-api-access-qqmqj\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.744840 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-log-httpd\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.744866 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-scripts\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.744913 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-config-data\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.744937 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.846014 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-log-httpd\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.846295 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-scripts\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.846323 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-config-data\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.846351 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.846393 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-run-httpd\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.846531 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-log-httpd\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.846992 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-run-httpd\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.847038 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.847059 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqmqj\" (UniqueName: \"kubernetes.io/projected/5b8e8868-a319-487c-b1d8-8070f652b3cf-kube-api-access-qqmqj\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.851528 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-scripts\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.855273 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " 
pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.862341 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-config-data\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.868368 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.870061 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqmqj\" (UniqueName: \"kubernetes.io/projected/5b8e8868-a319-487c-b1d8-8070f652b3cf-kube-api-access-qqmqj\") pod \"ceilometer-0\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " pod="openstack/ceilometer-0" Mar 13 12:11:26 crc kubenswrapper[4786]: I0313 12:11:26.912936 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:11:27 crc kubenswrapper[4786]: I0313 12:11:27.372942 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:11:27 crc kubenswrapper[4786]: W0313 12:11:27.380325 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b8e8868_a319_487c_b1d8_8070f652b3cf.slice/crio-7ff2e7a779513c5b7af110ede2737e06fd673ec7a644d04884b3e141d8f2866e WatchSource:0}: Error finding container 7ff2e7a779513c5b7af110ede2737e06fd673ec7a644d04884b3e141d8f2866e: Status 404 returned error can't find the container with id 7ff2e7a779513c5b7af110ede2737e06fd673ec7a644d04884b3e141d8f2866e Mar 13 12:11:27 crc kubenswrapper[4786]: I0313 12:11:27.453253 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5f7365-eee1-4011-8580-8aed69f6a457" path="/var/lib/kubelet/pods/2b5f7365-eee1-4011-8580-8aed69f6a457/volumes" Mar 13 12:11:27 crc kubenswrapper[4786]: I0313 12:11:27.530468 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerStarted","Data":"7ff2e7a779513c5b7af110ede2737e06fd673ec7a644d04884b3e141d8f2866e"} Mar 13 12:11:27 crc kubenswrapper[4786]: I0313 12:11:27.929831 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.100105 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnbmv\" (UniqueName: \"kubernetes.io/projected/085820e1-a384-4656-8200-bb5ae71491ae-kube-api-access-pnbmv\") pod \"085820e1-a384-4656-8200-bb5ae71491ae\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.100295 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-combined-ca-bundle\") pod \"085820e1-a384-4656-8200-bb5ae71491ae\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.100347 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-scripts\") pod \"085820e1-a384-4656-8200-bb5ae71491ae\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.100584 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-config-data\") pod \"085820e1-a384-4656-8200-bb5ae71491ae\" (UID: \"085820e1-a384-4656-8200-bb5ae71491ae\") " Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.104550 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085820e1-a384-4656-8200-bb5ae71491ae-kube-api-access-pnbmv" (OuterVolumeSpecName: "kube-api-access-pnbmv") pod "085820e1-a384-4656-8200-bb5ae71491ae" (UID: "085820e1-a384-4656-8200-bb5ae71491ae"). InnerVolumeSpecName "kube-api-access-pnbmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.106182 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-scripts" (OuterVolumeSpecName: "scripts") pod "085820e1-a384-4656-8200-bb5ae71491ae" (UID: "085820e1-a384-4656-8200-bb5ae71491ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.128630 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-config-data" (OuterVolumeSpecName: "config-data") pod "085820e1-a384-4656-8200-bb5ae71491ae" (UID: "085820e1-a384-4656-8200-bb5ae71491ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.132919 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "085820e1-a384-4656-8200-bb5ae71491ae" (UID: "085820e1-a384-4656-8200-bb5ae71491ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.204356 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.204411 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnbmv\" (UniqueName: \"kubernetes.io/projected/085820e1-a384-4656-8200-bb5ae71491ae-kube-api-access-pnbmv\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.204438 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.204457 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085820e1-a384-4656-8200-bb5ae71491ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.539751 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerStarted","Data":"79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887"} Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.541282 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" event={"ID":"085820e1-a384-4656-8200-bb5ae71491ae","Type":"ContainerDied","Data":"d652f5dd5f904d419a3f74bc1be78fecfc5d2e49874a87bc9625dad4b919e005"} Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.541309 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4jqvv" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.541318 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d652f5dd5f904d419a3f74bc1be78fecfc5d2e49874a87bc9625dad4b919e005" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.688541 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:11:28 crc kubenswrapper[4786]: E0313 12:11:28.688994 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085820e1-a384-4656-8200-bb5ae71491ae" containerName="nova-cell0-conductor-db-sync" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.689014 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="085820e1-a384-4656-8200-bb5ae71491ae" containerName="nova-cell0-conductor-db-sync" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.689243 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="085820e1-a384-4656-8200-bb5ae71491ae" containerName="nova-cell0-conductor-db-sync" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.689997 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.692258 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bx625" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.692434 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.707600 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.715097 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8h2f\" (UniqueName: \"kubernetes.io/projected/116541e7-d92f-48ff-ad78-7dba2f45fc18-kube-api-access-g8h2f\") pod \"nova-cell0-conductor-0\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.715417 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.715692 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.816741 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8h2f\" (UniqueName: 
\"kubernetes.io/projected/116541e7-d92f-48ff-ad78-7dba2f45fc18-kube-api-access-g8h2f\") pod \"nova-cell0-conductor-0\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.816806 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.816866 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.824207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.832681 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:28 crc kubenswrapper[4786]: I0313 12:11:28.836107 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8h2f\" (UniqueName: \"kubernetes.io/projected/116541e7-d92f-48ff-ad78-7dba2f45fc18-kube-api-access-g8h2f\") pod \"nova-cell0-conductor-0\" (UID: 
\"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:29 crc kubenswrapper[4786]: I0313 12:11:29.082725 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:29 crc kubenswrapper[4786]: I0313 12:11:29.555293 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerStarted","Data":"17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7"} Mar 13 12:11:29 crc kubenswrapper[4786]: I0313 12:11:29.572267 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:11:29 crc kubenswrapper[4786]: W0313 12:11:29.586117 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116541e7_d92f_48ff_ad78_7dba2f45fc18.slice/crio-ec15dc619d7e4c9c7de040774d7a5874bebf4d51d48d5087c5449ade950574bc WatchSource:0}: Error finding container ec15dc619d7e4c9c7de040774d7a5874bebf4d51d48d5087c5449ade950574bc: Status 404 returned error can't find the container with id ec15dc619d7e4c9c7de040774d7a5874bebf4d51d48d5087c5449ade950574bc Mar 13 12:11:30 crc kubenswrapper[4786]: I0313 12:11:30.568247 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerStarted","Data":"d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a"} Mar 13 12:11:30 crc kubenswrapper[4786]: I0313 12:11:30.570217 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"116541e7-d92f-48ff-ad78-7dba2f45fc18","Type":"ContainerStarted","Data":"9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3"} Mar 13 12:11:30 crc kubenswrapper[4786]: I0313 12:11:30.570255 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"116541e7-d92f-48ff-ad78-7dba2f45fc18","Type":"ContainerStarted","Data":"ec15dc619d7e4c9c7de040774d7a5874bebf4d51d48d5087c5449ade950574bc"} Mar 13 12:11:30 crc kubenswrapper[4786]: I0313 12:11:30.570394 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:30 crc kubenswrapper[4786]: I0313 12:11:30.597109 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.597081676 podStartE2EDuration="2.597081676s" podCreationTimestamp="2026-03-13 12:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:30.587639479 +0000 UTC m=+1477.867292946" watchObservedRunningTime="2026-03-13 12:11:30.597081676 +0000 UTC m=+1477.876735163" Mar 13 12:11:31 crc kubenswrapper[4786]: I0313 12:11:31.581805 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerStarted","Data":"494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c"} Mar 13 12:11:31 crc kubenswrapper[4786]: I0313 12:11:31.613124 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.784127771 podStartE2EDuration="5.613103045s" podCreationTimestamp="2026-03-13 12:11:26 +0000 UTC" firstStartedPulling="2026-03-13 12:11:27.383615593 +0000 UTC m=+1474.663269080" lastFinishedPulling="2026-03-13 12:11:31.212590897 +0000 UTC m=+1478.492244354" observedRunningTime="2026-03-13 12:11:31.600421411 +0000 UTC m=+1478.880074898" watchObservedRunningTime="2026-03-13 12:11:31.613103045 +0000 UTC m=+1478.892756512" Mar 13 12:11:32 crc kubenswrapper[4786]: I0313 12:11:32.591468 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 
13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.140067 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.753696 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vrgnd"] Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.755033 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.758106 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.760635 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.768469 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vrgnd"] Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.851990 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-config-data\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.852132 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-scripts\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.852175 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cm664\" (UniqueName: \"kubernetes.io/projected/6990d3ed-4503-4d9c-9f56-7b21a9abb203-kube-api-access-cm664\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.852208 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.920028 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.921290 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.927237 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.932273 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.956405 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-scripts\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.956458 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm664\" (UniqueName: \"kubernetes.io/projected/6990d3ed-4503-4d9c-9f56-7b21a9abb203-kube-api-access-cm664\") pod 
\"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.956482 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.956533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-config-data\") pod \"nova-scheduler-0\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.956578 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5clzx\" (UniqueName: \"kubernetes.io/projected/a4b50041-a1e3-47d1-903d-65a52e52dff2-kube-api-access-5clzx\") pod \"nova-scheduler-0\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.956613 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-config-data\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.956629 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.976902 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-config-data\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.977356 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-scripts\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.986393 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm664\" (UniqueName: \"kubernetes.io/projected/6990d3ed-4503-4d9c-9f56-7b21a9abb203-kube-api-access-cm664\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:34 crc kubenswrapper[4786]: I0313 12:11:34.996679 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vrgnd\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.061667 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-config-data\") pod \"nova-scheduler-0\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:35 crc kubenswrapper[4786]: 
I0313 12:11:35.061829 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5clzx\" (UniqueName: \"kubernetes.io/projected/a4b50041-a1e3-47d1-903d-65a52e52dff2-kube-api-access-5clzx\") pod \"nova-scheduler-0\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.061929 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.070255 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-config-data\") pod \"nova-scheduler-0\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.072497 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.072925 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.098083 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.103185 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.117383 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.118485 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5clzx\" (UniqueName: \"kubernetes.io/projected/a4b50041-a1e3-47d1-903d-65a52e52dff2-kube-api-access-5clzx\") pod \"nova-scheduler-0\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.123122 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.154526 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.156125 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.157838 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.163690 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.163760 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-config-data\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.163800 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.163818 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f833f08b-4f2e-4c3d-b754-4fa14a01a553-logs\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.163840 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p79h\" (UniqueName: 
\"kubernetes.io/projected/10a1967f-0614-4958-a74b-38b1ad0b1889-kube-api-access-9p79h\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.163860 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a1967f-0614-4958-a74b-38b1ad0b1889-logs\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.163900 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts6mp\" (UniqueName: \"kubernetes.io/projected/f833f08b-4f2e-4c3d-b754-4fa14a01a553-kube-api-access-ts6mp\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.163966 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-config-data\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.199527 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.239014 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.259161 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.260260 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.262487 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.265572 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-config-data\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.265647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.265692 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-config-data\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.265730 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.265753 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f833f08b-4f2e-4c3d-b754-4fa14a01a553-logs\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " 
pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.265773 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p79h\" (UniqueName: \"kubernetes.io/projected/10a1967f-0614-4958-a74b-38b1ad0b1889-kube-api-access-9p79h\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.265793 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a1967f-0614-4958-a74b-38b1ad0b1889-logs\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.265812 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts6mp\" (UniqueName: \"kubernetes.io/projected/f833f08b-4f2e-4c3d-b754-4fa14a01a553-kube-api-access-ts6mp\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.268252 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f833f08b-4f2e-4c3d-b754-4fa14a01a553-logs\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.269602 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a1967f-0614-4958-a74b-38b1ad0b1889-logs\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.270917 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-config-data\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.274146 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.274462 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.279452 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-config-data\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.283017 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.292326 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p79h\" (UniqueName: \"kubernetes.io/projected/10a1967f-0614-4958-a74b-38b1ad0b1889-kube-api-access-9p79h\") pod \"nova-api-0\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.292326 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts6mp\" (UniqueName: 
\"kubernetes.io/projected/f833f08b-4f2e-4c3d-b754-4fa14a01a553-kube-api-access-ts6mp\") pod \"nova-metadata-0\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") " pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.316432 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-kl8l2"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.317987 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.334400 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-kl8l2"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.367267 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.367332 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.367377 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.367443 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sww26\" (UniqueName: \"kubernetes.io/projected/6c25ada4-043b-4351-85c9-87f967f842bb-kube-api-access-sww26\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.367585 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr7mz\" (UniqueName: \"kubernetes.io/projected/d84e3cb0-36a5-411a-9463-c9237f1eb943-kube-api-access-tr7mz\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.367666 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-config\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.367845 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.367953 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-svc\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 
12:11:35.368066 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-swift-storage-0\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.470358 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-swift-storage-0\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.470706 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.470744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.470784 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.470835 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sww26\" (UniqueName: \"kubernetes.io/projected/6c25ada4-043b-4351-85c9-87f967f842bb-kube-api-access-sww26\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.470861 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr7mz\" (UniqueName: \"kubernetes.io/projected/d84e3cb0-36a5-411a-9463-c9237f1eb943-kube-api-access-tr7mz\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.470928 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-config\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.470995 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.471038 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-svc\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.471905 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.472007 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-swift-storage-0\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.474678 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.474874 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-svc\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.479735 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-config\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.481576 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.490236 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.502687 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sww26\" (UniqueName: \"kubernetes.io/projected/6c25ada4-043b-4351-85c9-87f967f842bb-kube-api-access-sww26\") pod \"dnsmasq-dns-5b74b5cfd5-kl8l2\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.507595 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr7mz\" (UniqueName: \"kubernetes.io/projected/d84e3cb0-36a5-411a-9463-c9237f1eb943-kube-api-access-tr7mz\") pod \"nova-cell1-novncproxy-0\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.552502 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.573259 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.586930 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.638357 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.786974 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vrgnd"] Mar 13 12:11:35 crc kubenswrapper[4786]: I0313 12:11:35.883609 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:35.998862 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.166083 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlf8"] Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.168078 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.174908 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.175035 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.189021 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlf8"] Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.206026 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-config-data\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.206082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.206140 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-scripts\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.206189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f6ws\" (UniqueName: \"kubernetes.io/projected/49d674fa-8483-4cba-a0ad-49ebd1f68558-kube-api-access-2f6ws\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.308753 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-scripts\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.308859 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f6ws\" (UniqueName: \"kubernetes.io/projected/49d674fa-8483-4cba-a0ad-49ebd1f68558-kube-api-access-2f6ws\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.308963 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-config-data\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.308998 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.314371 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-scripts\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.318758 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-config-data\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.329609 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.332911 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-0"] Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.335337 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f6ws\" (UniqueName: \"kubernetes.io/projected/49d674fa-8483-4cba-a0ad-49ebd1f68558-kube-api-access-2f6ws\") pod \"nova-cell1-conductor-db-sync-2jlf8\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.340015 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:11:36 crc kubenswrapper[4786]: W0313 12:11:36.358890 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10a1967f_0614_4958_a74b_38b1ad0b1889.slice/crio-371b8bb8b73cb028ef96b6b66ad97bb514c0b1b709084e331d4a67e9be90c084 WatchSource:0}: Error finding container 371b8bb8b73cb028ef96b6b66ad97bb514c0b1b709084e331d4a67e9be90c084: Status 404 returned error can't find the container with id 371b8bb8b73cb028ef96b6b66ad97bb514c0b1b709084e331d4a67e9be90c084 Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.463598 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-kl8l2"] Mar 13 12:11:36 crc kubenswrapper[4786]: W0313 12:11:36.481176 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c25ada4_043b_4351_85c9_87f967f842bb.slice/crio-a4eadc3e9ca746f43a2ce59feaa21f512713da4d93930dbe6be96cac28306f3d WatchSource:0}: Error finding container a4eadc3e9ca746f43a2ce59feaa21f512713da4d93930dbe6be96cac28306f3d: Status 404 returned error can't find the container with id a4eadc3e9ca746f43a2ce59feaa21f512713da4d93930dbe6be96cac28306f3d Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.497717 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.581730 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:11:36 crc kubenswrapper[4786]: W0313 12:11:36.595407 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd84e3cb0_36a5_411a_9463_c9237f1eb943.slice/crio-cdd4fc5c5151c78f26a5aebab2f1099029f9c3d7496855714957956ff00d005f WatchSource:0}: Error finding container cdd4fc5c5151c78f26a5aebab2f1099029f9c3d7496855714957956ff00d005f: Status 404 returned error can't find the container with id cdd4fc5c5151c78f26a5aebab2f1099029f9c3d7496855714957956ff00d005f Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.654218 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d84e3cb0-36a5-411a-9463-c9237f1eb943","Type":"ContainerStarted","Data":"cdd4fc5c5151c78f26a5aebab2f1099029f9c3d7496855714957956ff00d005f"} Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.658222 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4b50041-a1e3-47d1-903d-65a52e52dff2","Type":"ContainerStarted","Data":"1205455f9e6a3a6f70a747f1796eba44e99119ba73aa609c88bee62f97dc9470"} Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.659453 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" event={"ID":"6c25ada4-043b-4351-85c9-87f967f842bb","Type":"ContainerStarted","Data":"a4eadc3e9ca746f43a2ce59feaa21f512713da4d93930dbe6be96cac28306f3d"} Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.660671 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vrgnd" 
event={"ID":"6990d3ed-4503-4d9c-9f56-7b21a9abb203","Type":"ContainerStarted","Data":"65837bac0546f59350c540914448df362ce0c0ef074546abc8ea87fee95af8a5"}
Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.660690 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vrgnd" event={"ID":"6990d3ed-4503-4d9c-9f56-7b21a9abb203","Type":"ContainerStarted","Data":"1429ff32860aac2f0237197c37d7e7ad9b3c883d823e9504e26de1a3fc5aa4f8"}
Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.666280 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f833f08b-4f2e-4c3d-b754-4fa14a01a553","Type":"ContainerStarted","Data":"f0995fb3efa856d41c5a0faf2c7bfcb85e87f697d197e44c4a370a0de936926b"}
Mar 13 12:11:36 crc kubenswrapper[4786]: I0313 12:11:36.668215 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a1967f-0614-4958-a74b-38b1ad0b1889","Type":"ContainerStarted","Data":"371b8bb8b73cb028ef96b6b66ad97bb514c0b1b709084e331d4a67e9be90c084"}
Mar 13 12:11:37 crc kubenswrapper[4786]: I0313 12:11:37.043441 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vrgnd" podStartSLOduration=3.043425469 podStartE2EDuration="3.043425469s" podCreationTimestamp="2026-03-13 12:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:36.693138884 +0000 UTC m=+1483.972792321" watchObservedRunningTime="2026-03-13 12:11:37.043425469 +0000 UTC m=+1484.323078916"
Mar 13 12:11:37 crc kubenswrapper[4786]: I0313 12:11:37.065466 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlf8"]
Mar 13 12:11:37 crc kubenswrapper[4786]: W0313 12:11:37.076776 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d674fa_8483_4cba_a0ad_49ebd1f68558.slice/crio-520539bd6435fa02bba29e6f1959093b49ffba6d95cde39c250c0c020308b653 WatchSource:0}: Error finding container 520539bd6435fa02bba29e6f1959093b49ffba6d95cde39c250c0c020308b653: Status 404 returned error can't find the container with id 520539bd6435fa02bba29e6f1959093b49ffba6d95cde39c250c0c020308b653
Mar 13 12:11:37 crc kubenswrapper[4786]: I0313 12:11:37.688643 4786 generic.go:334] "Generic (PLEG): container finished" podID="6c25ada4-043b-4351-85c9-87f967f842bb" containerID="5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b" exitCode=0
Mar 13 12:11:37 crc kubenswrapper[4786]: I0313 12:11:37.688757 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" event={"ID":"6c25ada4-043b-4351-85c9-87f967f842bb","Type":"ContainerDied","Data":"5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b"}
Mar 13 12:11:37 crc kubenswrapper[4786]: I0313 12:11:37.717224 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" event={"ID":"49d674fa-8483-4cba-a0ad-49ebd1f68558","Type":"ContainerStarted","Data":"e9fc0d6300970d7f8d6d459a014308012ae54ecd24efc235ec0affdfe4f8a2a3"}
Mar 13 12:11:37 crc kubenswrapper[4786]: I0313 12:11:37.717278 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" event={"ID":"49d674fa-8483-4cba-a0ad-49ebd1f68558","Type":"ContainerStarted","Data":"520539bd6435fa02bba29e6f1959093b49ffba6d95cde39c250c0c020308b653"}
Mar 13 12:11:37 crc kubenswrapper[4786]: I0313 12:11:37.738772 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" podStartSLOduration=1.738751406 podStartE2EDuration="1.738751406s" podCreationTimestamp="2026-03-13 12:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:37.733061581 +0000 UTC m=+1485.012715028" watchObservedRunningTime="2026-03-13 12:11:37.738751406 +0000 UTC m=+1485.018404843"
Mar 13 12:11:38 crc kubenswrapper[4786]: I0313 12:11:38.169104 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:11:38 crc kubenswrapper[4786]: I0313 12:11:38.169439 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:11:38 crc kubenswrapper[4786]: I0313 12:11:38.169479 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8"
Mar 13 12:11:38 crc kubenswrapper[4786]: I0313 12:11:38.170072 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e7dfa6aebd4ca8695c470b1c4f1a2306b0f2eefc624c2d634686a5a8cd4e40b"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 12:11:38 crc kubenswrapper[4786]: I0313 12:11:38.170125 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://1e7dfa6aebd4ca8695c470b1c4f1a2306b0f2eefc624c2d634686a5a8cd4e40b" gracePeriod=600
Mar 13 12:11:38 crc kubenswrapper[4786]: I0313 12:11:38.765490 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="1e7dfa6aebd4ca8695c470b1c4f1a2306b0f2eefc624c2d634686a5a8cd4e40b" exitCode=0
Mar 13 12:11:38 crc kubenswrapper[4786]: I0313 12:11:38.765966 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"1e7dfa6aebd4ca8695c470b1c4f1a2306b0f2eefc624c2d634686a5a8cd4e40b"}
Mar 13 12:11:38 crc kubenswrapper[4786]: I0313 12:11:38.766006 4786 scope.go:117] "RemoveContainer" containerID="5656b6c6cc644913041fc5892205e2cc6f507fb238f0bcbc7956307710968e91"
Mar 13 12:11:39 crc kubenswrapper[4786]: I0313 12:11:39.283655 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:11:39 crc kubenswrapper[4786]: I0313 12:11:39.296930 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.791863 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" event={"ID":"6c25ada4-043b-4351-85c9-87f967f842bb","Type":"ContainerStarted","Data":"5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a"}
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.792350 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2"
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.794935 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f833f08b-4f2e-4c3d-b754-4fa14a01a553","Type":"ContainerStarted","Data":"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6"}
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.796177 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a1967f-0614-4958-a74b-38b1ad0b1889","Type":"ContainerStarted","Data":"ef263589f071e63e4efd997f22657bcddc785401bde3484b7cdc186235176ae8"}
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.799008 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d84e3cb0-36a5-411a-9463-c9237f1eb943","Type":"ContainerStarted","Data":"b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4"}
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.799133 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d84e3cb0-36a5-411a-9463-c9237f1eb943" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4" gracePeriod=30
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.803610 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754"}
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.805075 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4b50041-a1e3-47d1-903d-65a52e52dff2","Type":"ContainerStarted","Data":"d0e161fb00cbacbe1e0ca304dfe756f4c1a329aadbc3b529090a873221aba112"}
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.867730 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" podStartSLOduration=5.867712455 podStartE2EDuration="5.867712455s" podCreationTimestamp="2026-03-13 12:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:40.832153291 +0000 UTC m=+1488.111806748" watchObservedRunningTime="2026-03-13 12:11:40.867712455 +0000 UTC m=+1488.147365902"
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.897658 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.921191832 podStartE2EDuration="6.897636238s" podCreationTimestamp="2026-03-13 12:11:34 +0000 UTC" firstStartedPulling="2026-03-13 12:11:35.998604049 +0000 UTC m=+1483.278257496" lastFinishedPulling="2026-03-13 12:11:39.975048455 +0000 UTC m=+1487.254701902" observedRunningTime="2026-03-13 12:11:40.885499108 +0000 UTC m=+1488.165152555" watchObservedRunningTime="2026-03-13 12:11:40.897636238 +0000 UTC m=+1488.177289685"
Mar 13 12:11:40 crc kubenswrapper[4786]: I0313 12:11:40.919287 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.541850753 podStartE2EDuration="5.919259884s" podCreationTimestamp="2026-03-13 12:11:35 +0000 UTC" firstStartedPulling="2026-03-13 12:11:36.597653634 +0000 UTC m=+1483.877307151" lastFinishedPulling="2026-03-13 12:11:39.975062815 +0000 UTC m=+1487.254716282" observedRunningTime="2026-03-13 12:11:40.911409832 +0000 UTC m=+1488.191063279" watchObservedRunningTime="2026-03-13 12:11:40.919259884 +0000 UTC m=+1488.198913331"
Mar 13 12:11:41 crc kubenswrapper[4786]: I0313 12:11:41.815260 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerName="nova-metadata-log" containerID="cri-o://f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6" gracePeriod=30
Mar 13 12:11:41 crc kubenswrapper[4786]: I0313 12:11:41.815957 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerName="nova-metadata-metadata" containerID="cri-o://8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151" gracePeriod=30
Mar 13 12:11:41 crc kubenswrapper[4786]: I0313 12:11:41.815728 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f833f08b-4f2e-4c3d-b754-4fa14a01a553","Type":"ContainerStarted","Data":"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151"}
Mar 13 12:11:41 crc kubenswrapper[4786]: I0313 12:11:41.817401 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a1967f-0614-4958-a74b-38b1ad0b1889","Type":"ContainerStarted","Data":"6f64281da4aa8cf483b737b3b0a3c14552d8ab3d54b5e412e6af976479c43180"}
Mar 13 12:11:41 crc kubenswrapper[4786]: I0313 12:11:41.861552 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.208149852 podStartE2EDuration="6.861529531s" podCreationTimestamp="2026-03-13 12:11:35 +0000 UTC" firstStartedPulling="2026-03-13 12:11:36.369726169 +0000 UTC m=+1483.649379616" lastFinishedPulling="2026-03-13 12:11:40.023105848 +0000 UTC m=+1487.302759295" observedRunningTime="2026-03-13 12:11:41.841528829 +0000 UTC m=+1489.121182296" watchObservedRunningTime="2026-03-13 12:11:41.861529531 +0000 UTC m=+1489.141182988"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.449687 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.480315 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.85082391 podStartE2EDuration="7.480291251s" podCreationTimestamp="2026-03-13 12:11:35 +0000 UTC" firstStartedPulling="2026-03-13 12:11:36.364654301 +0000 UTC m=+1483.644307748" lastFinishedPulling="2026-03-13 12:11:39.994121642 +0000 UTC m=+1487.273775089" observedRunningTime="2026-03-13 12:11:41.870370462 +0000 UTC m=+1489.150023919" watchObservedRunningTime="2026-03-13 12:11:42.480291251 +0000 UTC m=+1489.759944708"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.565072 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-config-data\") pod \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") "
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.565115 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f833f08b-4f2e-4c3d-b754-4fa14a01a553-logs\") pod \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") "
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.565283 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-combined-ca-bundle\") pod \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") "
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.565334 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts6mp\" (UniqueName: \"kubernetes.io/projected/f833f08b-4f2e-4c3d-b754-4fa14a01a553-kube-api-access-ts6mp\") pod \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\" (UID: \"f833f08b-4f2e-4c3d-b754-4fa14a01a553\") "
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.567394 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f833f08b-4f2e-4c3d-b754-4fa14a01a553-logs" (OuterVolumeSpecName: "logs") pod "f833f08b-4f2e-4c3d-b754-4fa14a01a553" (UID: "f833f08b-4f2e-4c3d-b754-4fa14a01a553"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.587601 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f833f08b-4f2e-4c3d-b754-4fa14a01a553-kube-api-access-ts6mp" (OuterVolumeSpecName: "kube-api-access-ts6mp") pod "f833f08b-4f2e-4c3d-b754-4fa14a01a553" (UID: "f833f08b-4f2e-4c3d-b754-4fa14a01a553"). InnerVolumeSpecName "kube-api-access-ts6mp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.599065 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-config-data" (OuterVolumeSpecName: "config-data") pod "f833f08b-4f2e-4c3d-b754-4fa14a01a553" (UID: "f833f08b-4f2e-4c3d-b754-4fa14a01a553"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.629762 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f833f08b-4f2e-4c3d-b754-4fa14a01a553" (UID: "f833f08b-4f2e-4c3d-b754-4fa14a01a553"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.669934 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.669980 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts6mp\" (UniqueName: \"kubernetes.io/projected/f833f08b-4f2e-4c3d-b754-4fa14a01a553-kube-api-access-ts6mp\") on node \"crc\" DevicePath \"\""
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.669998 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f833f08b-4f2e-4c3d-b754-4fa14a01a553-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.670010 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f833f08b-4f2e-4c3d-b754-4fa14a01a553-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.838154 4786 generic.go:334] "Generic (PLEG): container finished" podID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerID="8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151" exitCode=0
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.838499 4786 generic.go:334] "Generic (PLEG): container finished" podID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerID="f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6" exitCode=143
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.838234 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.838244 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f833f08b-4f2e-4c3d-b754-4fa14a01a553","Type":"ContainerDied","Data":"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151"}
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.838579 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f833f08b-4f2e-4c3d-b754-4fa14a01a553","Type":"ContainerDied","Data":"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6"}
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.838601 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f833f08b-4f2e-4c3d-b754-4fa14a01a553","Type":"ContainerDied","Data":"f0995fb3efa856d41c5a0faf2c7bfcb85e87f697d197e44c4a370a0de936926b"}
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.838616 4786 scope.go:117] "RemoveContainer" containerID="8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.887096 4786 scope.go:117] "RemoveContainer" containerID="f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.901330 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.921462 4786 scope.go:117] "RemoveContainer" containerID="8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151"
Mar 13 12:11:42 crc kubenswrapper[4786]: E0313 12:11:42.921984 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151\": container with ID starting with 8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151 not found: ID does not exist" containerID="8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.922027 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151"} err="failed to get container status \"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151\": rpc error: code = NotFound desc = could not find container \"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151\": container with ID starting with 8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151 not found: ID does not exist"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.922055 4786 scope.go:117] "RemoveContainer" containerID="f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6"
Mar 13 12:11:42 crc kubenswrapper[4786]: E0313 12:11:42.922413 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6\": container with ID starting with f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6 not found: ID does not exist" containerID="f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.922455 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6"} err="failed to get container status \"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6\": rpc error: code = NotFound desc = could not find container \"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6\": container with ID starting with f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6 not found: ID does not exist"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.922480 4786 scope.go:117] "RemoveContainer" containerID="8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.922783 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151"} err="failed to get container status \"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151\": rpc error: code = NotFound desc = could not find container \"8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151\": container with ID starting with 8555b472ea11f2cf4b3c1bca5f2b22ae6c8cd311d44c5204f6bae6680310b151 not found: ID does not exist"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.922810 4786 scope.go:117] "RemoveContainer" containerID="f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.923824 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6"} err="failed to get container status \"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6\": rpc error: code = NotFound desc = could not find container \"f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6\": container with ID starting with f7e8754bd53cca5d245a90c5bd3e89d0245dc0249f5049fc117cbe3c91983ac6 not found: ID does not exist"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.924486 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.932464 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:11:42 crc kubenswrapper[4786]: E0313 12:11:42.933025 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerName="nova-metadata-metadata"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.933051 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerName="nova-metadata-metadata"
Mar 13 12:11:42 crc kubenswrapper[4786]: E0313 12:11:42.933074 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerName="nova-metadata-log"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.933084 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerName="nova-metadata-log"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.933363 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerName="nova-metadata-metadata"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.933415 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" containerName="nova-metadata-log"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.934746 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.940470 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.943958 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 13 12:11:42 crc kubenswrapper[4786]: I0313 12:11:42.944142 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.076550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.076693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2cbbd-f5cf-4ace-b384-94f994b05549-logs\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.076808 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.077677 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-config-data\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.077796 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wt2s\" (UniqueName: \"kubernetes.io/projected/1aa2cbbd-f5cf-4ace-b384-94f994b05549-kube-api-access-6wt2s\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.179331 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.179402 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2cbbd-f5cf-4ace-b384-94f994b05549-logs\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.179463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.179502 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-config-data\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.179542 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wt2s\" (UniqueName: \"kubernetes.io/projected/1aa2cbbd-f5cf-4ace-b384-94f994b05549-kube-api-access-6wt2s\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.180113 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2cbbd-f5cf-4ace-b384-94f994b05549-logs\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.184521 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.187583 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.188029 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-config-data\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.207759 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wt2s\" (UniqueName: \"kubernetes.io/projected/1aa2cbbd-f5cf-4ace-b384-94f994b05549-kube-api-access-6wt2s\") pod \"nova-metadata-0\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.262007 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.453268 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f833f08b-4f2e-4c3d-b754-4fa14a01a553" path="/var/lib/kubelet/pods/f833f08b-4f2e-4c3d-b754-4fa14a01a553/volumes"
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.731155 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:11:43 crc kubenswrapper[4786]: I0313 12:11:43.850673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aa2cbbd-f5cf-4ace-b384-94f994b05549","Type":"ContainerStarted","Data":"46f523613f6ad97fcb8f0e5cea1226e695ac97ea7b1e9fdd76fe0ab440e4081b"}
Mar 13 12:11:44 crc kubenswrapper[4786]: I0313 12:11:44.871808 4786 generic.go:334] "Generic (PLEG): container finished" podID="6990d3ed-4503-4d9c-9f56-7b21a9abb203" containerID="65837bac0546f59350c540914448df362ce0c0ef074546abc8ea87fee95af8a5" exitCode=0
Mar 13 12:11:44 crc kubenswrapper[4786]: I0313 12:11:44.871952 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vrgnd" event={"ID":"6990d3ed-4503-4d9c-9f56-7b21a9abb203","Type":"ContainerDied","Data":"65837bac0546f59350c540914448df362ce0c0ef074546abc8ea87fee95af8a5"}
Mar 13 12:11:44 crc kubenswrapper[4786]: I0313 12:11:44.875708 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aa2cbbd-f5cf-4ace-b384-94f994b05549","Type":"ContainerStarted","Data":"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc"}
Mar 13 12:11:44 crc kubenswrapper[4786]: I0313 12:11:44.875766 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aa2cbbd-f5cf-4ace-b384-94f994b05549","Type":"ContainerStarted","Data":"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744"}
Mar 13 12:11:44 crc kubenswrapper[4786]: I0313 12:11:44.937375 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.93734653 podStartE2EDuration="2.93734653s" podCreationTimestamp="2026-03-13 12:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:44.919964819 +0000 UTC m=+1492.199618346" watchObservedRunningTime="2026-03-13 12:11:44.93734653 +0000 UTC m=+1492.217000007"
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.239595 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.239917 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.277797 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.573942 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.574021 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.588564 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.640235 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2"
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.735289 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-5628g"]
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.735529 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5547746bbf-5628g" podUID="018af6b4-883b-4532-a65f-58f8b6e00b39" containerName="dnsmasq-dns" containerID="cri-o://2e436c466c2593e6599542d39b946854245f5133ce60a7d4917eab893b67f827" gracePeriod=10
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.892187 4786 generic.go:334] "Generic (PLEG): container finished" podID="49d674fa-8483-4cba-a0ad-49ebd1f68558" containerID="e9fc0d6300970d7f8d6d459a014308012ae54ecd24efc235ec0affdfe4f8a2a3" exitCode=0
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.892326 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" event={"ID":"49d674fa-8483-4cba-a0ad-49ebd1f68558","Type":"ContainerDied","Data":"e9fc0d6300970d7f8d6d459a014308012ae54ecd24efc235ec0affdfe4f8a2a3"}
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.899201 4786 generic.go:334] "Generic (PLEG): container finished" podID="018af6b4-883b-4532-a65f-58f8b6e00b39" containerID="2e436c466c2593e6599542d39b946854245f5133ce60a7d4917eab893b67f827" exitCode=0
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.900256 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-5628g" event={"ID":"018af6b4-883b-4532-a65f-58f8b6e00b39","Type":"ContainerDied","Data":"2e436c466c2593e6599542d39b946854245f5133ce60a7d4917eab893b67f827"}
Mar 13 12:11:45 crc kubenswrapper[4786]: I0313 12:11:45.936339 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.384797 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5547746bbf-5628g"
Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.391283 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vrgnd"
Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.569939 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-sb\") pod \"018af6b4-883b-4532-a65f-58f8b6e00b39\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") "
Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.569991 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm664\" (UniqueName: \"kubernetes.io/projected/6990d3ed-4503-4d9c-9f56-7b21a9abb203-kube-api-access-cm664\") pod \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") "
Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.570082 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-config\") pod \"018af6b4-883b-4532-a65f-58f8b6e00b39\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") "
Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.570196 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-combined-ca-bundle\") pod \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") "
Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.570263 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-swift-storage-0\") pod \"018af6b4-883b-4532-a65f-58f8b6e00b39\" (UID:
\"018af6b4-883b-4532-a65f-58f8b6e00b39\") " Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.570298 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-scripts\") pod \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.570335 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-svc\") pod \"018af6b4-883b-4532-a65f-58f8b6e00b39\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.570395 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-nb\") pod \"018af6b4-883b-4532-a65f-58f8b6e00b39\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.570435 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/018af6b4-883b-4532-a65f-58f8b6e00b39-kube-api-access-m8gkl\") pod \"018af6b4-883b-4532-a65f-58f8b6e00b39\" (UID: \"018af6b4-883b-4532-a65f-58f8b6e00b39\") " Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.570464 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-config-data\") pod \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\" (UID: \"6990d3ed-4503-4d9c-9f56-7b21a9abb203\") " Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.574113 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.595070 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-scripts" (OuterVolumeSpecName: "scripts") pod "6990d3ed-4503-4d9c-9f56-7b21a9abb203" (UID: "6990d3ed-4503-4d9c-9f56-7b21a9abb203"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.595163 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018af6b4-883b-4532-a65f-58f8b6e00b39-kube-api-access-m8gkl" (OuterVolumeSpecName: "kube-api-access-m8gkl") pod "018af6b4-883b-4532-a65f-58f8b6e00b39" (UID: "018af6b4-883b-4532-a65f-58f8b6e00b39"). InnerVolumeSpecName "kube-api-access-m8gkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.601101 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6990d3ed-4503-4d9c-9f56-7b21a9abb203-kube-api-access-cm664" (OuterVolumeSpecName: "kube-api-access-cm664") pod "6990d3ed-4503-4d9c-9f56-7b21a9abb203" (UID: "6990d3ed-4503-4d9c-9f56-7b21a9abb203"). InnerVolumeSpecName "kube-api-access-cm664". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.606288 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6990d3ed-4503-4d9c-9f56-7b21a9abb203" (UID: "6990d3ed-4503-4d9c-9f56-7b21a9abb203"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.616162 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.630984 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-config-data" (OuterVolumeSpecName: "config-data") pod "6990d3ed-4503-4d9c-9f56-7b21a9abb203" (UID: "6990d3ed-4503-4d9c-9f56-7b21a9abb203"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.634743 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-config" (OuterVolumeSpecName: "config") pod "018af6b4-883b-4532-a65f-58f8b6e00b39" (UID: "018af6b4-883b-4532-a65f-58f8b6e00b39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.644254 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "018af6b4-883b-4532-a65f-58f8b6e00b39" (UID: "018af6b4-883b-4532-a65f-58f8b6e00b39"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.644386 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "018af6b4-883b-4532-a65f-58f8b6e00b39" (UID: "018af6b4-883b-4532-a65f-58f8b6e00b39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.649240 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "018af6b4-883b-4532-a65f-58f8b6e00b39" (UID: "018af6b4-883b-4532-a65f-58f8b6e00b39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672733 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672772 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672782 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672790 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 
12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672800 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/018af6b4-883b-4532-a65f-58f8b6e00b39-kube-api-access-m8gkl\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672810 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672821 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm664\" (UniqueName: \"kubernetes.io/projected/6990d3ed-4503-4d9c-9f56-7b21a9abb203-kube-api-access-cm664\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672829 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.672837 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6990d3ed-4503-4d9c-9f56-7b21a9abb203-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.682709 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "018af6b4-883b-4532-a65f-58f8b6e00b39" (UID: "018af6b4-883b-4532-a65f-58f8b6e00b39"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.774548 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/018af6b4-883b-4532-a65f-58f8b6e00b39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.914342 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vrgnd" event={"ID":"6990d3ed-4503-4d9c-9f56-7b21a9abb203","Type":"ContainerDied","Data":"1429ff32860aac2f0237197c37d7e7ad9b3c883d823e9504e26de1a3fc5aa4f8"} Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.914389 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1429ff32860aac2f0237197c37d7e7ad9b3c883d823e9504e26de1a3fc5aa4f8" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.914390 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vrgnd" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.917988 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5547746bbf-5628g" event={"ID":"018af6b4-883b-4532-a65f-58f8b6e00b39","Type":"ContainerDied","Data":"d488ede029ccab7ae0016686e8917bf4d74c3909fcf86e27293438eb9bb3b67a"} Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.918051 4786 scope.go:117] "RemoveContainer" containerID="2e436c466c2593e6599542d39b946854245f5133ce60a7d4917eab893b67f827" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.918185 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5547746bbf-5628g" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.987357 4786 scope.go:117] "RemoveContainer" containerID="19c9ff3e830a432f46800afce68efbb91d132b1e93d7254352532633446da752" Mar 13 12:11:46 crc kubenswrapper[4786]: I0313 12:11:46.992473 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-5628g"] Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.003700 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5547746bbf-5628g"] Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.102219 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.120002 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.120245 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-log" containerID="cri-o://ef263589f071e63e4efd997f22657bcddc785401bde3484b7cdc186235176ae8" gracePeriod=30 Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.120439 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-api" containerID="cri-o://6f64281da4aa8cf483b737b3b0a3c14552d8ab3d54b5e412e6af976479c43180" gracePeriod=30 Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.131412 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.131692 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerName="nova-metadata-log" 
containerID="cri-o://95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744" gracePeriod=30 Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.132220 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerName="nova-metadata-metadata" containerID="cri-o://c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc" gracePeriod=30 Mar 13 12:11:47 crc kubenswrapper[4786]: E0313 12:11:47.352375 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10a1967f_0614_4958_a74b_38b1ad0b1889.slice/crio-conmon-ef263589f071e63e4efd997f22657bcddc785401bde3484b7cdc186235176ae8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa2cbbd_f5cf_4ace_b384_94f994b05549.slice/crio-95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10a1967f_0614_4958_a74b_38b1ad0b1889.slice/crio-ef263589f071e63e4efd997f22657bcddc785401bde3484b7cdc186235176ae8.scope\": RecentStats: unable to find data in memory cache]" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.429714 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.472919 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018af6b4-883b-4532-a65f-58f8b6e00b39" path="/var/lib/kubelet/pods/018af6b4-883b-4532-a65f-58f8b6e00b39/volumes" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.648373 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-combined-ca-bundle\") pod \"49d674fa-8483-4cba-a0ad-49ebd1f68558\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.648469 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-scripts\") pod \"49d674fa-8483-4cba-a0ad-49ebd1f68558\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.648570 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-config-data\") pod \"49d674fa-8483-4cba-a0ad-49ebd1f68558\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.648710 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f6ws\" (UniqueName: \"kubernetes.io/projected/49d674fa-8483-4cba-a0ad-49ebd1f68558-kube-api-access-2f6ws\") pod \"49d674fa-8483-4cba-a0ad-49ebd1f68558\" (UID: \"49d674fa-8483-4cba-a0ad-49ebd1f68558\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.664754 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-scripts" (OuterVolumeSpecName: "scripts") pod 
"49d674fa-8483-4cba-a0ad-49ebd1f68558" (UID: "49d674fa-8483-4cba-a0ad-49ebd1f68558"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.672086 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d674fa-8483-4cba-a0ad-49ebd1f68558-kube-api-access-2f6ws" (OuterVolumeSpecName: "kube-api-access-2f6ws") pod "49d674fa-8483-4cba-a0ad-49ebd1f68558" (UID: "49d674fa-8483-4cba-a0ad-49ebd1f68558"). InnerVolumeSpecName "kube-api-access-2f6ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.696038 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49d674fa-8483-4cba-a0ad-49ebd1f68558" (UID: "49d674fa-8483-4cba-a0ad-49ebd1f68558"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.717816 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.722047 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-config-data" (OuterVolumeSpecName: "config-data") pod "49d674fa-8483-4cba-a0ad-49ebd1f68558" (UID: "49d674fa-8483-4cba-a0ad-49ebd1f68558"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.751132 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.751162 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f6ws\" (UniqueName: \"kubernetes.io/projected/49d674fa-8483-4cba-a0ad-49ebd1f68558-kube-api-access-2f6ws\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.751171 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.751180 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d674fa-8483-4cba-a0ad-49ebd1f68558-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.852160 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2cbbd-f5cf-4ace-b384-94f994b05549-logs\") pod \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.852356 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-nova-metadata-tls-certs\") pod \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.852427 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-config-data\") pod \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.852470 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-combined-ca-bundle\") pod \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.852490 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wt2s\" (UniqueName: \"kubernetes.io/projected/1aa2cbbd-f5cf-4ace-b384-94f994b05549-kube-api-access-6wt2s\") pod \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\" (UID: \"1aa2cbbd-f5cf-4ace-b384-94f994b05549\") " Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.852517 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa2cbbd-f5cf-4ace-b384-94f994b05549-logs" (OuterVolumeSpecName: "logs") pod "1aa2cbbd-f5cf-4ace-b384-94f994b05549" (UID: "1aa2cbbd-f5cf-4ace-b384-94f994b05549"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.852984 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2cbbd-f5cf-4ace-b384-94f994b05549-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.857010 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa2cbbd-f5cf-4ace-b384-94f994b05549-kube-api-access-6wt2s" (OuterVolumeSpecName: "kube-api-access-6wt2s") pod "1aa2cbbd-f5cf-4ace-b384-94f994b05549" (UID: "1aa2cbbd-f5cf-4ace-b384-94f994b05549"). InnerVolumeSpecName "kube-api-access-6wt2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.880264 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aa2cbbd-f5cf-4ace-b384-94f994b05549" (UID: "1aa2cbbd-f5cf-4ace-b384-94f994b05549"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.882925 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-config-data" (OuterVolumeSpecName: "config-data") pod "1aa2cbbd-f5cf-4ace-b384-94f994b05549" (UID: "1aa2cbbd-f5cf-4ace-b384-94f994b05549"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.895273 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1aa2cbbd-f5cf-4ace-b384-94f994b05549" (UID: "1aa2cbbd-f5cf-4ace-b384-94f994b05549"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.935203 4786 generic.go:334] "Generic (PLEG): container finished" podID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerID="c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc" exitCode=0 Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.935241 4786 generic.go:334] "Generic (PLEG): container finished" podID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerID="95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744" exitCode=143 Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.935284 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aa2cbbd-f5cf-4ace-b384-94f994b05549","Type":"ContainerDied","Data":"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc"} Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.935316 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aa2cbbd-f5cf-4ace-b384-94f994b05549","Type":"ContainerDied","Data":"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744"} Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.935329 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aa2cbbd-f5cf-4ace-b384-94f994b05549","Type":"ContainerDied","Data":"46f523613f6ad97fcb8f0e5cea1226e695ac97ea7b1e9fdd76fe0ab440e4081b"} Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.935346 4786 scope.go:117] "RemoveContainer" containerID="c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.935478 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.938121 4786 generic.go:334] "Generic (PLEG): container finished" podID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerID="ef263589f071e63e4efd997f22657bcddc785401bde3484b7cdc186235176ae8" exitCode=143 Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.938193 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a1967f-0614-4958-a74b-38b1ad0b1889","Type":"ContainerDied","Data":"ef263589f071e63e4efd997f22657bcddc785401bde3484b7cdc186235176ae8"} Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.942487 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" event={"ID":"49d674fa-8483-4cba-a0ad-49ebd1f68558","Type":"ContainerDied","Data":"520539bd6435fa02bba29e6f1959093b49ffba6d95cde39c250c0c020308b653"} Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.942518 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="520539bd6435fa02bba29e6f1959093b49ffba6d95cde39c250c0c020308b653" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.942608 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2jlf8" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.943916 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a4b50041-a1e3-47d1-903d-65a52e52dff2" containerName="nova-scheduler-scheduler" containerID="cri-o://d0e161fb00cbacbe1e0ca304dfe756f4c1a329aadbc3b529090a873221aba112" gracePeriod=30 Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.957282 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.957334 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.957354 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2cbbd-f5cf-4ace-b384-94f994b05549-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.957372 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wt2s\" (UniqueName: \"kubernetes.io/projected/1aa2cbbd-f5cf-4ace-b384-94f994b05549-kube-api-access-6wt2s\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:47 crc kubenswrapper[4786]: I0313 12:11:47.982384 4786 scope.go:117] "RemoveContainer" containerID="95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.005371 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.039370 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.053089 4786 scope.go:117] "RemoveContainer" containerID="c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc" Mar 13 12:11:48 crc kubenswrapper[4786]: E0313 12:11:48.061374 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc\": container with ID starting with c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc not found: ID does not exist" containerID="c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.061427 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc"} err="failed to get container status \"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc\": rpc error: code = NotFound desc = could not find container \"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc\": container with ID starting with c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc not found: ID does not exist" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.061459 4786 scope.go:117] "RemoveContainer" containerID="95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744" Mar 13 12:11:48 crc kubenswrapper[4786]: E0313 12:11:48.062794 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744\": container with ID starting with 95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744 not found: ID does not exist" containerID="95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.062822 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744"} err="failed to get container status \"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744\": rpc error: code = NotFound desc = could not find container \"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744\": container with ID starting with 95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744 not found: ID does not exist" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.062841 4786 scope.go:117] "RemoveContainer" containerID="c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.067744 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:11:48 crc kubenswrapper[4786]: E0313 12:11:48.068237 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6990d3ed-4503-4d9c-9f56-7b21a9abb203" containerName="nova-manage" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068258 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6990d3ed-4503-4d9c-9f56-7b21a9abb203" containerName="nova-manage" Mar 13 12:11:48 crc kubenswrapper[4786]: E0313 12:11:48.068276 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerName="nova-metadata-metadata" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068286 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerName="nova-metadata-metadata" Mar 13 12:11:48 crc kubenswrapper[4786]: E0313 12:11:48.068298 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerName="nova-metadata-log" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068305 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerName="nova-metadata-log" Mar 13 12:11:48 crc kubenswrapper[4786]: E0313 12:11:48.068326 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018af6b4-883b-4532-a65f-58f8b6e00b39" containerName="dnsmasq-dns" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068334 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="018af6b4-883b-4532-a65f-58f8b6e00b39" containerName="dnsmasq-dns" Mar 13 12:11:48 crc kubenswrapper[4786]: E0313 12:11:48.068347 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018af6b4-883b-4532-a65f-58f8b6e00b39" containerName="init" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068354 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="018af6b4-883b-4532-a65f-58f8b6e00b39" containerName="init" Mar 13 12:11:48 crc kubenswrapper[4786]: E0313 12:11:48.068384 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d674fa-8483-4cba-a0ad-49ebd1f68558" containerName="nova-cell1-conductor-db-sync" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068393 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d674fa-8483-4cba-a0ad-49ebd1f68558" containerName="nova-cell1-conductor-db-sync" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068590 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6990d3ed-4503-4d9c-9f56-7b21a9abb203" containerName="nova-manage" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068607 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerName="nova-metadata-metadata" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068620 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d674fa-8483-4cba-a0ad-49ebd1f68558" containerName="nova-cell1-conductor-db-sync" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068647 4786 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="018af6b4-883b-4532-a65f-58f8b6e00b39" containerName="dnsmasq-dns" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.068661 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" containerName="nova-metadata-log" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.069392 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.071528 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc"} err="failed to get container status \"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc\": rpc error: code = NotFound desc = could not find container \"c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc\": container with ID starting with c27354c3d9823562cb8577444d175b7d8ce47d3015f4c5c222edcfc326eceadc not found: ID does not exist" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.071747 4786 scope.go:117] "RemoveContainer" containerID="95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.076026 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744"} err="failed to get container status \"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744\": rpc error: code = NotFound desc = could not find container \"95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744\": container with ID starting with 95758a31f7c5364e94cd5663426de7e4e0c0a12590247e9086d7c0aabe3f8744 not found: ID does not exist" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.076360 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.098297 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.108916 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.110871 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.115500 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.115828 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.121028 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.161782 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.161832 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.162012 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.162049 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a28074-c7a5-4a91-880b-2e0a28bf0de5-logs\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.162142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcj2b\" (UniqueName: \"kubernetes.io/projected/b488d3ce-635a-4279-a05e-fba3b6599bda-kube-api-access-bcj2b\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.162172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.162197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-config-data\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.162324 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwcrp\" (UniqueName: 
\"kubernetes.io/projected/88a28074-c7a5-4a91-880b-2e0a28bf0de5-kube-api-access-vwcrp\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.263454 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcj2b\" (UniqueName: \"kubernetes.io/projected/b488d3ce-635a-4279-a05e-fba3b6599bda-kube-api-access-bcj2b\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.263796 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.263828 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-config-data\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.264114 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwcrp\" (UniqueName: \"kubernetes.io/projected/88a28074-c7a5-4a91-880b-2e0a28bf0de5-kube-api-access-vwcrp\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.264170 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.264190 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.264224 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.264243 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a28074-c7a5-4a91-880b-2e0a28bf0de5-logs\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.264709 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a28074-c7a5-4a91-880b-2e0a28bf0de5-logs\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.268090 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.268759 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.269039 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.275702 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.277756 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-config-data\") pod \"nova-metadata-0\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.291519 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcj2b\" (UniqueName: \"kubernetes.io/projected/b488d3ce-635a-4279-a05e-fba3b6599bda-kube-api-access-bcj2b\") pod \"nova-cell1-conductor-0\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.298775 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwcrp\" (UniqueName: \"kubernetes.io/projected/88a28074-c7a5-4a91-880b-2e0a28bf0de5-kube-api-access-vwcrp\") pod \"nova-metadata-0\" (UID: 
\"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.405965 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.432685 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.929812 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.946470 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.959411 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88a28074-c7a5-4a91-880b-2e0a28bf0de5","Type":"ContainerStarted","Data":"a41f81d93f9ee9ba1baff99643bdd6cc58b19825ee684844ccb30999badf5437"} Mar 13 12:11:48 crc kubenswrapper[4786]: I0313 12:11:48.960924 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b488d3ce-635a-4279-a05e-fba3b6599bda","Type":"ContainerStarted","Data":"cde4f39690d9f0257866837da4dfee74e2cf3f24aec6115a79c278edf3ba33bc"} Mar 13 12:11:49 crc kubenswrapper[4786]: I0313 12:11:49.462127 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa2cbbd-f5cf-4ace-b384-94f994b05549" path="/var/lib/kubelet/pods/1aa2cbbd-f5cf-4ace-b384-94f994b05549/volumes" Mar 13 12:11:49 crc kubenswrapper[4786]: I0313 12:11:49.982974 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88a28074-c7a5-4a91-880b-2e0a28bf0de5","Type":"ContainerStarted","Data":"9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24"} Mar 13 12:11:49 crc kubenswrapper[4786]: I0313 12:11:49.983323 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"88a28074-c7a5-4a91-880b-2e0a28bf0de5","Type":"ContainerStarted","Data":"94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600"} Mar 13 12:11:49 crc kubenswrapper[4786]: I0313 12:11:49.985697 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b488d3ce-635a-4279-a05e-fba3b6599bda","Type":"ContainerStarted","Data":"4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c"} Mar 13 12:11:49 crc kubenswrapper[4786]: I0313 12:11:49.985872 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.005535 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.005441874 podStartE2EDuration="2.005441874s" podCreationTimestamp="2026-03-13 12:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:50.000527961 +0000 UTC m=+1497.280181428" watchObservedRunningTime="2026-03-13 12:11:50.005441874 +0000 UTC m=+1497.285095351" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.029910 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.029874257 podStartE2EDuration="3.029874257s" podCreationTimestamp="2026-03-13 12:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:50.017852201 +0000 UTC m=+1497.297505668" watchObservedRunningTime="2026-03-13 12:11:50.029874257 +0000 UTC m=+1497.309527704" Mar 13 12:11:50 crc kubenswrapper[4786]: E0313 12:11:50.241175 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0e161fb00cbacbe1e0ca304dfe756f4c1a329aadbc3b529090a873221aba112" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:11:50 crc kubenswrapper[4786]: E0313 12:11:50.242973 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0e161fb00cbacbe1e0ca304dfe756f4c1a329aadbc3b529090a873221aba112" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:11:50 crc kubenswrapper[4786]: E0313 12:11:50.244952 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0e161fb00cbacbe1e0ca304dfe756f4c1a329aadbc3b529090a873221aba112" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:11:50 crc kubenswrapper[4786]: E0313 12:11:50.245005 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a4b50041-a1e3-47d1-903d-65a52e52dff2" containerName="nova-scheduler-scheduler" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.467518 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fh478"] Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.469992 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.479279 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fh478"] Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.617012 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j525q\" (UniqueName: \"kubernetes.io/projected/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-kube-api-access-j525q\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.617082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-utilities\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.618130 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-catalog-content\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.719839 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-catalog-content\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.720073 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j525q\" (UniqueName: \"kubernetes.io/projected/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-kube-api-access-j525q\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.720121 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-utilities\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.720696 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-utilities\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.721013 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-catalog-content\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.745487 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j525q\" (UniqueName: \"kubernetes.io/projected/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-kube-api-access-j525q\") pod \"community-operators-fh478\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:50 crc kubenswrapper[4786]: I0313 12:11:50.800145 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fh478" Mar 13 12:11:51 crc kubenswrapper[4786]: I0313 12:11:51.365443 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fh478"] Mar 13 12:11:51 crc kubenswrapper[4786]: W0313 12:11:51.380129 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27124d0e_8b24_4aac_86fc_4c7ab1a3531a.slice/crio-76d4aa20b3d5c3b586ffe1c9493efb92b2832396888a84c1b82c88d13c923b83 WatchSource:0}: Error finding container 76d4aa20b3d5c3b586ffe1c9493efb92b2832396888a84c1b82c88d13c923b83: Status 404 returned error can't find the container with id 76d4aa20b3d5c3b586ffe1c9493efb92b2832396888a84c1b82c88d13c923b83 Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.019103 4786 generic.go:334] "Generic (PLEG): container finished" podID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerID="d3127b185b39c452f33ab78267d8b882fda4cf0a792c0c8a71ea9c249355d81a" exitCode=0 Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.019275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh478" event={"ID":"27124d0e-8b24-4aac-86fc-4c7ab1a3531a","Type":"ContainerDied","Data":"d3127b185b39c452f33ab78267d8b882fda4cf0a792c0c8a71ea9c249355d81a"} Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.019494 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh478" event={"ID":"27124d0e-8b24-4aac-86fc-4c7ab1a3531a","Type":"ContainerStarted","Data":"76d4aa20b3d5c3b586ffe1c9493efb92b2832396888a84c1b82c88d13c923b83"} Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.023194 4786 generic.go:334] "Generic (PLEG): container finished" podID="a4b50041-a1e3-47d1-903d-65a52e52dff2" containerID="d0e161fb00cbacbe1e0ca304dfe756f4c1a329aadbc3b529090a873221aba112" exitCode=0 Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 
12:11:52.023230 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4b50041-a1e3-47d1-903d-65a52e52dff2","Type":"ContainerDied","Data":"d0e161fb00cbacbe1e0ca304dfe756f4c1a329aadbc3b529090a873221aba112"} Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.351784 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.453994 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-combined-ca-bundle\") pod \"a4b50041-a1e3-47d1-903d-65a52e52dff2\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.454113 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-config-data\") pod \"a4b50041-a1e3-47d1-903d-65a52e52dff2\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.454262 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5clzx\" (UniqueName: \"kubernetes.io/projected/a4b50041-a1e3-47d1-903d-65a52e52dff2-kube-api-access-5clzx\") pod \"a4b50041-a1e3-47d1-903d-65a52e52dff2\" (UID: \"a4b50041-a1e3-47d1-903d-65a52e52dff2\") " Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.461639 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b50041-a1e3-47d1-903d-65a52e52dff2-kube-api-access-5clzx" (OuterVolumeSpecName: "kube-api-access-5clzx") pod "a4b50041-a1e3-47d1-903d-65a52e52dff2" (UID: "a4b50041-a1e3-47d1-903d-65a52e52dff2"). InnerVolumeSpecName "kube-api-access-5clzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.487181 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-config-data" (OuterVolumeSpecName: "config-data") pod "a4b50041-a1e3-47d1-903d-65a52e52dff2" (UID: "a4b50041-a1e3-47d1-903d-65a52e52dff2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.489061 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4b50041-a1e3-47d1-903d-65a52e52dff2" (UID: "a4b50041-a1e3-47d1-903d-65a52e52dff2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.557508 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5clzx\" (UniqueName: \"kubernetes.io/projected/a4b50041-a1e3-47d1-903d-65a52e52dff2-kube-api-access-5clzx\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.558302 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:52 crc kubenswrapper[4786]: I0313 12:11:52.558340 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b50041-a1e3-47d1-903d-65a52e52dff2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.034379 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.034371 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4b50041-a1e3-47d1-903d-65a52e52dff2","Type":"ContainerDied","Data":"1205455f9e6a3a6f70a747f1796eba44e99119ba73aa609c88bee62f97dc9470"} Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.035304 4786 scope.go:117] "RemoveContainer" containerID="d0e161fb00cbacbe1e0ca304dfe756f4c1a329aadbc3b529090a873221aba112" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.041111 4786 generic.go:334] "Generic (PLEG): container finished" podID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerID="6f64281da4aa8cf483b737b3b0a3c14552d8ab3d54b5e412e6af976479c43180" exitCode=0 Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.041161 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a1967f-0614-4958-a74b-38b1ad0b1889","Type":"ContainerDied","Data":"6f64281da4aa8cf483b737b3b0a3c14552d8ab3d54b5e412e6af976479c43180"} Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.041190 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a1967f-0614-4958-a74b-38b1ad0b1889","Type":"ContainerDied","Data":"371b8bb8b73cb028ef96b6b66ad97bb514c0b1b709084e331d4a67e9be90c084"} Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.041205 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="371b8bb8b73cb028ef96b6b66ad97bb514c0b1b709084e331d4a67e9be90c084" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.092155 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.112986 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.139716 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.149793 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:53 crc kubenswrapper[4786]: E0313 12:11:53.150164 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-log" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.150180 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-log" Mar 13 12:11:53 crc kubenswrapper[4786]: E0313 12:11:53.150206 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-api" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.150213 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-api" Mar 13 12:11:53 crc kubenswrapper[4786]: E0313 12:11:53.150227 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b50041-a1e3-47d1-903d-65a52e52dff2" containerName="nova-scheduler-scheduler" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.150234 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b50041-a1e3-47d1-903d-65a52e52dff2" containerName="nova-scheduler-scheduler" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.150454 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-api" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.150488 4786 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a4b50041-a1e3-47d1-903d-65a52e52dff2" containerName="nova-scheduler-scheduler" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.150500 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" containerName="nova-api-log" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.151018 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.172119 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.182273 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.274328 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a1967f-0614-4958-a74b-38b1ad0b1889-logs\") pod \"10a1967f-0614-4958-a74b-38b1ad0b1889\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.274465 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-config-data\") pod \"10a1967f-0614-4958-a74b-38b1ad0b1889\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.274492 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-combined-ca-bundle\") pod \"10a1967f-0614-4958-a74b-38b1ad0b1889\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.274548 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9p79h\" (UniqueName: \"kubernetes.io/projected/10a1967f-0614-4958-a74b-38b1ad0b1889-kube-api-access-9p79h\") pod \"10a1967f-0614-4958-a74b-38b1ad0b1889\" (UID: \"10a1967f-0614-4958-a74b-38b1ad0b1889\") " Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.274903 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-config-data\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.274932 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.275001 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2m8s\" (UniqueName: \"kubernetes.io/projected/b964d018-9a2e-4174-996e-43d8f690752e-kube-api-access-m2m8s\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.275116 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10a1967f-0614-4958-a74b-38b1ad0b1889-logs" (OuterVolumeSpecName: "logs") pod "10a1967f-0614-4958-a74b-38b1ad0b1889" (UID: "10a1967f-0614-4958-a74b-38b1ad0b1889"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.290281 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a1967f-0614-4958-a74b-38b1ad0b1889-kube-api-access-9p79h" (OuterVolumeSpecName: "kube-api-access-9p79h") pod "10a1967f-0614-4958-a74b-38b1ad0b1889" (UID: "10a1967f-0614-4958-a74b-38b1ad0b1889"). InnerVolumeSpecName "kube-api-access-9p79h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.303661 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10a1967f-0614-4958-a74b-38b1ad0b1889" (UID: "10a1967f-0614-4958-a74b-38b1ad0b1889"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.311480 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-config-data" (OuterVolumeSpecName: "config-data") pod "10a1967f-0614-4958-a74b-38b1ad0b1889" (UID: "10a1967f-0614-4958-a74b-38b1ad0b1889"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.376373 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-config-data\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.376435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.376511 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2m8s\" (UniqueName: \"kubernetes.io/projected/b964d018-9a2e-4174-996e-43d8f690752e-kube-api-access-m2m8s\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.377157 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.377302 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a1967f-0614-4958-a74b-38b1ad0b1889-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.377322 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p79h\" (UniqueName: \"kubernetes.io/projected/10a1967f-0614-4958-a74b-38b1ad0b1889-kube-api-access-9p79h\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:53 crc 
kubenswrapper[4786]: I0313 12:11:53.377338 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a1967f-0614-4958-a74b-38b1ad0b1889-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.382064 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.390333 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.395496 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2m8s\" (UniqueName: \"kubernetes.io/projected/b964d018-9a2e-4174-996e-43d8f690752e-kube-api-access-m2m8s\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.401960 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-config-data\") pod \"nova-scheduler-0\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.433129 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.433176 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.455498 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b50041-a1e3-47d1-903d-65a52e52dff2" 
path="/var/lib/kubelet/pods/a4b50041-a1e3-47d1-903d-65a52e52dff2/volumes" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.504713 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:11:53 crc kubenswrapper[4786]: I0313 12:11:53.811781 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.053250 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b964d018-9a2e-4174-996e-43d8f690752e","Type":"ContainerStarted","Data":"0171b1bae4f274c3a18b8f31a3de6718c2032d1a6934705bb45ce447b13b2e34"} Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.053313 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b964d018-9a2e-4174-996e-43d8f690752e","Type":"ContainerStarted","Data":"a4c44926c573169c92a0635e0bc2f5df32c2fbe6bb27d7fe2aa762a0ea94cacf"} Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.055833 4786 generic.go:334] "Generic (PLEG): container finished" podID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerID="e6a65578a92790a911098a672c1d67fd4e80922281eff90b52f359eb042acc5d" exitCode=0 Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.055918 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh478" event={"ID":"27124d0e-8b24-4aac-86fc-4c7ab1a3531a","Type":"ContainerDied","Data":"e6a65578a92790a911098a672c1d67fd4e80922281eff90b52f359eb042acc5d"} Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.056038 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.087970 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.087948577 podStartE2EDuration="1.087948577s" podCreationTimestamp="2026-03-13 12:11:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:54.07144466 +0000 UTC m=+1501.351098157" watchObservedRunningTime="2026-03-13 12:11:54.087948577 +0000 UTC m=+1501.367602054" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.112740 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.125075 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.156177 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.157550 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.159914 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.169670 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.193010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-config-data\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.193127 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqz24\" (UniqueName: \"kubernetes.io/projected/b970b959-5fc2-4486-a6af-f931f04f0eb0-kube-api-access-mqz24\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.193153 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.193247 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b970b959-5fc2-4486-a6af-f931f04f0eb0-logs\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.295983 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b970b959-5fc2-4486-a6af-f931f04f0eb0-logs\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.296170 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-config-data\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.296257 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqz24\" (UniqueName: \"kubernetes.io/projected/b970b959-5fc2-4486-a6af-f931f04f0eb0-kube-api-access-mqz24\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.296281 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.296499 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b970b959-5fc2-4486-a6af-f931f04f0eb0-logs\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.302945 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.303525 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-config-data\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.311695 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqz24\" (UniqueName: \"kubernetes.io/projected/b970b959-5fc2-4486-a6af-f931f04f0eb0-kube-api-access-mqz24\") pod \"nova-api-0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.480765 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:11:54 crc kubenswrapper[4786]: I0313 12:11:54.997499 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:11:55 crc kubenswrapper[4786]: I0313 12:11:55.068119 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b970b959-5fc2-4486-a6af-f931f04f0eb0","Type":"ContainerStarted","Data":"407a42e7638f54df847f414745f1fbecd3c06f313ebf055a555e09598c660e22"} Mar 13 12:11:55 crc kubenswrapper[4786]: I0313 12:11:55.071143 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh478" event={"ID":"27124d0e-8b24-4aac-86fc-4c7ab1a3531a","Type":"ContainerStarted","Data":"060bb336db765c93dd9c2cf7c1d9d5354851a4dddc1d5d05002e48ee62170aaa"} Mar 13 12:11:55 crc kubenswrapper[4786]: I0313 12:11:55.461029 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a1967f-0614-4958-a74b-38b1ad0b1889" path="/var/lib/kubelet/pods/10a1967f-0614-4958-a74b-38b1ad0b1889/volumes" Mar 13 12:11:56 crc kubenswrapper[4786]: I0313 12:11:56.081189 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b970b959-5fc2-4486-a6af-f931f04f0eb0","Type":"ContainerStarted","Data":"33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985"} Mar 13 12:11:56 crc kubenswrapper[4786]: I0313 12:11:56.081599 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b970b959-5fc2-4486-a6af-f931f04f0eb0","Type":"ContainerStarted","Data":"afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676"} Mar 13 12:11:56 crc kubenswrapper[4786]: I0313 12:11:56.105732 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fh478" podStartSLOduration=3.517213042 podStartE2EDuration="6.105710847s" podCreationTimestamp="2026-03-13 12:11:50 +0000 UTC" firstStartedPulling="2026-03-13 12:11:52.020595563 +0000 UTC m=+1499.300249010" lastFinishedPulling="2026-03-13 12:11:54.609093358 +0000 UTC m=+1501.888746815" observedRunningTime="2026-03-13 12:11:55.090150001 +0000 UTC m=+1502.369803508" watchObservedRunningTime="2026-03-13 12:11:56.105710847 +0000 UTC m=+1503.385364304" Mar 13 12:11:56 crc kubenswrapper[4786]: I0313 12:11:56.113634 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.113615761 podStartE2EDuration="2.113615761s" podCreationTimestamp="2026-03-13 12:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:11:56.101589445 +0000 UTC m=+1503.381242892" watchObservedRunningTime="2026-03-13 12:11:56.113615761 +0000 UTC m=+1503.393269208" Mar 13 12:11:56 crc kubenswrapper[4786]: I0313 12:11:56.918328 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 12:11:58 crc kubenswrapper[4786]: I0313 12:11:58.433444 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:11:58 crc 
kubenswrapper[4786]: I0313 12:11:58.433510 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:11:58 crc kubenswrapper[4786]: I0313 12:11:58.435039 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 13 12:11:58 crc kubenswrapper[4786]: I0313 12:11:58.505746 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.452319 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.452324 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.679963 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-62cxs"] Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.702430 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.706570 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-62cxs"] Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.711898 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-catalog-content\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.712129 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgm7m\" (UniqueName: \"kubernetes.io/projected/b8bad192-9467-4657-b343-1fa02b6f2702-kube-api-access-qgm7m\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.712208 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-utilities\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.813118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-catalog-content\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.813520 4786 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qgm7m\" (UniqueName: \"kubernetes.io/projected/b8bad192-9467-4657-b343-1fa02b6f2702-kube-api-access-qgm7m\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.813665 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-utilities\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.813692 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-catalog-content\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.813966 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-utilities\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:11:59 crc kubenswrapper[4786]: I0313 12:11:59.835724 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgm7m\" (UniqueName: \"kubernetes.io/projected/b8bad192-9467-4657-b343-1fa02b6f2702-kube-api-access-qgm7m\") pod \"redhat-operators-62cxs\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.020794 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.145017 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556732-fkzcg"] Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.146552 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-fkzcg" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.156316 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-fkzcg"] Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.168498 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.168690 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.168812 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.219984 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bkhx\" (UniqueName: \"kubernetes.io/projected/f6e83607-3ddd-4f8f-885d-b723affa2133-kube-api-access-5bkhx\") pod \"auto-csr-approver-29556732-fkzcg\" (UID: \"f6e83607-3ddd-4f8f-885d-b723affa2133\") " pod="openshift-infra/auto-csr-approver-29556732-fkzcg" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.322289 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bkhx\" (UniqueName: \"kubernetes.io/projected/f6e83607-3ddd-4f8f-885d-b723affa2133-kube-api-access-5bkhx\") pod \"auto-csr-approver-29556732-fkzcg\" (UID: \"f6e83607-3ddd-4f8f-885d-b723affa2133\") " pod="openshift-infra/auto-csr-approver-29556732-fkzcg" 
Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.352707 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bkhx\" (UniqueName: \"kubernetes.io/projected/f6e83607-3ddd-4f8f-885d-b723affa2133-kube-api-access-5bkhx\") pod \"auto-csr-approver-29556732-fkzcg\" (UID: \"f6e83607-3ddd-4f8f-885d-b723affa2133\") " pod="openshift-infra/auto-csr-approver-29556732-fkzcg" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.493181 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-fkzcg" Mar 13 12:12:00 crc kubenswrapper[4786]: I0313 12:12:00.617714 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-62cxs"] Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:00.802369 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fh478" Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:00.802411 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fh478" Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:00.860958 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fh478" Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:00.984870 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-fkzcg"] Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:01.130976 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-fkzcg" event={"ID":"f6e83607-3ddd-4f8f-885d-b723affa2133","Type":"ContainerStarted","Data":"6efc90f188365ee08caab3657494c30d40b0c9115416db771a595ba2ace19feb"} Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:01.132135 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="b8bad192-9467-4657-b343-1fa02b6f2702" containerID="4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293" exitCode=0 Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:01.132255 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62cxs" event={"ID":"b8bad192-9467-4657-b343-1fa02b6f2702","Type":"ContainerDied","Data":"4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293"} Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:01.132313 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62cxs" event={"ID":"b8bad192-9467-4657-b343-1fa02b6f2702","Type":"ContainerStarted","Data":"552609aa275fb8c509e8446bb25f62673a8f6c211476b9d2610f037212bdc8b3"} Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:01.202903 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fh478" Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:01.729800 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:12:01 crc kubenswrapper[4786]: I0313 12:12:01.730008 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="befbdb06-def2-49f6-83c8-c3a84dd09334" containerName="kube-state-metrics" containerID="cri-o://f8f7253cf1f9e788176a413b643a024cd5a1da997bb15d8a92fc01baed2f8c02" gracePeriod=30 Mar 13 12:12:02 crc kubenswrapper[4786]: I0313 12:12:02.142458 4786 generic.go:334] "Generic (PLEG): container finished" podID="befbdb06-def2-49f6-83c8-c3a84dd09334" containerID="f8f7253cf1f9e788176a413b643a024cd5a1da997bb15d8a92fc01baed2f8c02" exitCode=2 Mar 13 12:12:02 crc kubenswrapper[4786]: I0313 12:12:02.142718 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"befbdb06-def2-49f6-83c8-c3a84dd09334","Type":"ContainerDied","Data":"f8f7253cf1f9e788176a413b643a024cd5a1da997bb15d8a92fc01baed2f8c02"} Mar 13 12:12:02 crc kubenswrapper[4786]: I0313 12:12:02.322672 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:12:02 crc kubenswrapper[4786]: I0313 12:12:02.469534 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt2rt\" (UniqueName: \"kubernetes.io/projected/befbdb06-def2-49f6-83c8-c3a84dd09334-kube-api-access-lt2rt\") pod \"befbdb06-def2-49f6-83c8-c3a84dd09334\" (UID: \"befbdb06-def2-49f6-83c8-c3a84dd09334\") " Mar 13 12:12:02 crc kubenswrapper[4786]: I0313 12:12:02.476389 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befbdb06-def2-49f6-83c8-c3a84dd09334-kube-api-access-lt2rt" (OuterVolumeSpecName: "kube-api-access-lt2rt") pod "befbdb06-def2-49f6-83c8-c3a84dd09334" (UID: "befbdb06-def2-49f6-83c8-c3a84dd09334"). InnerVolumeSpecName "kube-api-access-lt2rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:02 crc kubenswrapper[4786]: I0313 12:12:02.571795 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt2rt\" (UniqueName: \"kubernetes.io/projected/befbdb06-def2-49f6-83c8-c3a84dd09334-kube-api-access-lt2rt\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.158802 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"befbdb06-def2-49f6-83c8-c3a84dd09334","Type":"ContainerDied","Data":"86dd67ae5f5f05a266da244184a9dfebc8ad3556c3723bc5a97c5eeadc46bd53"} Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.159950 4786 scope.go:117] "RemoveContainer" containerID="f8f7253cf1f9e788176a413b643a024cd5a1da997bb15d8a92fc01baed2f8c02" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.159234 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.165076 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62cxs" event={"ID":"b8bad192-9467-4657-b343-1fa02b6f2702","Type":"ContainerStarted","Data":"81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4"} Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.212593 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.233962 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.240792 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:12:03 crc kubenswrapper[4786]: E0313 12:12:03.241358 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befbdb06-def2-49f6-83c8-c3a84dd09334" containerName="kube-state-metrics" 
Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.241382 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="befbdb06-def2-49f6-83c8-c3a84dd09334" containerName="kube-state-metrics" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.241654 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="befbdb06-def2-49f6-83c8-c3a84dd09334" containerName="kube-state-metrics" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.242535 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.246094 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.246356 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.252066 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fh478"] Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.252799 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fh478" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerName="registry-server" containerID="cri-o://060bb336db765c93dd9c2cf7c1d9d5354851a4dddc1d5d05002e48ee62170aaa" gracePeriod=2 Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.264739 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.387913 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " 
pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.387995 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.388119 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7d6q\" (UniqueName: \"kubernetes.io/projected/39720781-e027-4319-9c8f-1d9134d269f8-kube-api-access-g7d6q\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.388312 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.452903 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befbdb06-def2-49f6-83c8-c3a84dd09334" path="/var/lib/kubelet/pods/befbdb06-def2-49f6-83c8-c3a84dd09334/volumes" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.489982 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7d6q\" (UniqueName: \"kubernetes.io/projected/39720781-e027-4319-9c8f-1d9134d269f8-kube-api-access-g7d6q\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.490110 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.490337 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.490437 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.502923 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.505363 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.505533 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.505582 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.511473 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7d6q\" (UniqueName: \"kubernetes.io/projected/39720781-e027-4319-9c8f-1d9134d269f8-kube-api-access-g7d6q\") pod \"kube-state-metrics-0\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.535044 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.535307 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="ceilometer-central-agent" containerID="cri-o://79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887" gracePeriod=30 Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.535691 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="proxy-httpd" containerID="cri-o://494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c" gracePeriod=30 Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.535741 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="sg-core" containerID="cri-o://d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a" 
gracePeriod=30 Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.535772 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="ceilometer-notification-agent" containerID="cri-o://17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7" gracePeriod=30 Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.566610 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:12:03 crc kubenswrapper[4786]: I0313 12:12:03.582672 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.022500 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.231107 4786 generic.go:334] "Generic (PLEG): container finished" podID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerID="494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c" exitCode=0 Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.231367 4786 generic.go:334] "Generic (PLEG): container finished" podID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerID="d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a" exitCode=2 Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.231378 4786 generic.go:334] "Generic (PLEG): container finished" podID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerID="79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887" exitCode=0 Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.231424 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerDied","Data":"494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c"} Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 
12:12:04.231453 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerDied","Data":"d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a"} Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.231466 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerDied","Data":"79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887"} Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.236566 4786 generic.go:334] "Generic (PLEG): container finished" podID="f6e83607-3ddd-4f8f-885d-b723affa2133" containerID="c4439fbf918b3a73b489027b0f4677a7f89a0a3677dcff0e1750fe03ad5bddb6" exitCode=0 Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.236620 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-fkzcg" event={"ID":"f6e83607-3ddd-4f8f-885d-b723affa2133","Type":"ContainerDied","Data":"c4439fbf918b3a73b489027b0f4677a7f89a0a3677dcff0e1750fe03ad5bddb6"} Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.246514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39720781-e027-4319-9c8f-1d9134d269f8","Type":"ContainerStarted","Data":"2deaa1c11423b0839d0708a1e1d23621f889f01543843f5d7a0d106d21d28ab6"} Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.341111 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.481903 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:12:04 crc kubenswrapper[4786]: I0313 12:12:04.481983 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.266739 4786 
generic.go:334] "Generic (PLEG): container finished" podID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerID="060bb336db765c93dd9c2cf7c1d9d5354851a4dddc1d5d05002e48ee62170aaa" exitCode=0 Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.266821 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh478" event={"ID":"27124d0e-8b24-4aac-86fc-4c7ab1a3531a","Type":"ContainerDied","Data":"060bb336db765c93dd9c2cf7c1d9d5354851a4dddc1d5d05002e48ee62170aaa"} Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.269000 4786 generic.go:334] "Generic (PLEG): container finished" podID="b8bad192-9467-4657-b343-1fa02b6f2702" containerID="81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4" exitCode=0 Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.269103 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62cxs" event={"ID":"b8bad192-9467-4657-b343-1fa02b6f2702","Type":"ContainerDied","Data":"81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4"} Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.565058 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.565323 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.895127 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-fkzcg" Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.942800 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bkhx\" (UniqueName: \"kubernetes.io/projected/f6e83607-3ddd-4f8f-885d-b723affa2133-kube-api-access-5bkhx\") pod \"f6e83607-3ddd-4f8f-885d-b723affa2133\" (UID: \"f6e83607-3ddd-4f8f-885d-b723affa2133\") " Mar 13 12:12:05 crc kubenswrapper[4786]: I0313 12:12:05.956363 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e83607-3ddd-4f8f-885d-b723affa2133-kube-api-access-5bkhx" (OuterVolumeSpecName: "kube-api-access-5bkhx") pod "f6e83607-3ddd-4f8f-885d-b723affa2133" (UID: "f6e83607-3ddd-4f8f-885d-b723affa2133"). InnerVolumeSpecName "kube-api-access-5bkhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.049030 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bkhx\" (UniqueName: \"kubernetes.io/projected/f6e83607-3ddd-4f8f-885d-b723affa2133-kube-api-access-5bkhx\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.169799 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fh478" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.192732 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.252530 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-config-data\") pod \"5b8e8868-a319-487c-b1d8-8070f652b3cf\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.252619 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j525q\" (UniqueName: \"kubernetes.io/projected/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-kube-api-access-j525q\") pod \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.252646 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-catalog-content\") pod \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.252677 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-run-httpd\") pod \"5b8e8868-a319-487c-b1d8-8070f652b3cf\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.252753 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-sg-core-conf-yaml\") pod \"5b8e8868-a319-487c-b1d8-8070f652b3cf\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.252835 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-scripts\") pod \"5b8e8868-a319-487c-b1d8-8070f652b3cf\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.252859 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-combined-ca-bundle\") pod \"5b8e8868-a319-487c-b1d8-8070f652b3cf\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.252964 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqmqj\" (UniqueName: \"kubernetes.io/projected/5b8e8868-a319-487c-b1d8-8070f652b3cf-kube-api-access-qqmqj\") pod \"5b8e8868-a319-487c-b1d8-8070f652b3cf\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.253005 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-utilities\") pod \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\" (UID: \"27124d0e-8b24-4aac-86fc-4c7ab1a3531a\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.253041 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-log-httpd\") pod \"5b8e8868-a319-487c-b1d8-8070f652b3cf\" (UID: \"5b8e8868-a319-487c-b1d8-8070f652b3cf\") " Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.253788 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-utilities" (OuterVolumeSpecName: "utilities") pod "27124d0e-8b24-4aac-86fc-4c7ab1a3531a" (UID: "27124d0e-8b24-4aac-86fc-4c7ab1a3531a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.253915 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5b8e8868-a319-487c-b1d8-8070f652b3cf" (UID: "5b8e8868-a319-487c-b1d8-8070f652b3cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.255332 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5b8e8868-a319-487c-b1d8-8070f652b3cf" (UID: "5b8e8868-a319-487c-b1d8-8070f652b3cf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.264048 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8e8868-a319-487c-b1d8-8070f652b3cf-kube-api-access-qqmqj" (OuterVolumeSpecName: "kube-api-access-qqmqj") pod "5b8e8868-a319-487c-b1d8-8070f652b3cf" (UID: "5b8e8868-a319-487c-b1d8-8070f652b3cf"). InnerVolumeSpecName "kube-api-access-qqmqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.264131 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-kube-api-access-j525q" (OuterVolumeSpecName: "kube-api-access-j525q") pod "27124d0e-8b24-4aac-86fc-4c7ab1a3531a" (UID: "27124d0e-8b24-4aac-86fc-4c7ab1a3531a"). InnerVolumeSpecName "kube-api-access-j525q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.275152 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-scripts" (OuterVolumeSpecName: "scripts") pod "5b8e8868-a319-487c-b1d8-8070f652b3cf" (UID: "5b8e8868-a319-487c-b1d8-8070f652b3cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.290482 4786 generic.go:334] "Generic (PLEG): container finished" podID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerID="17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7" exitCode=0 Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.290631 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.290658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerDied","Data":"17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7"} Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.291506 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b8e8868-a319-487c-b1d8-8070f652b3cf","Type":"ContainerDied","Data":"7ff2e7a779513c5b7af110ede2737e06fd673ec7a644d04884b3e141d8f2866e"} Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.291526 4786 scope.go:117] "RemoveContainer" containerID="494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.293104 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-fkzcg" event={"ID":"f6e83607-3ddd-4f8f-885d-b723affa2133","Type":"ContainerDied","Data":"6efc90f188365ee08caab3657494c30d40b0c9115416db771a595ba2ace19feb"} Mar 
13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.293144 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6efc90f188365ee08caab3657494c30d40b0c9115416db771a595ba2ace19feb" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.293196 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-fkzcg" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.302966 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39720781-e027-4319-9c8f-1d9134d269f8","Type":"ContainerStarted","Data":"e54841fed760ec7f6d2745e7319245bad6d4e8f266b28143dbe25cdfa3e60e17"} Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.303333 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.309513 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh478" event={"ID":"27124d0e-8b24-4aac-86fc-4c7ab1a3531a","Type":"ContainerDied","Data":"76d4aa20b3d5c3b586ffe1c9493efb92b2832396888a84c1b82c88d13c923b83"} Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.309777 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fh478" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.312005 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5b8e8868-a319-487c-b1d8-8070f652b3cf" (UID: "5b8e8868-a319-487c-b1d8-8070f652b3cf"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.328503 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27124d0e-8b24-4aac-86fc-4c7ab1a3531a" (UID: "27124d0e-8b24-4aac-86fc-4c7ab1a3531a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.330264 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.932405641 podStartE2EDuration="3.330242506s" podCreationTimestamp="2026-03-13 12:12:03 +0000 UTC" firstStartedPulling="2026-03-13 12:12:04.028269485 +0000 UTC m=+1511.307922932" lastFinishedPulling="2026-03-13 12:12:04.42610635 +0000 UTC m=+1511.705759797" observedRunningTime="2026-03-13 12:12:06.327747098 +0000 UTC m=+1513.607400565" watchObservedRunningTime="2026-03-13 12:12:06.330242506 +0000 UTC m=+1513.609895973" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.360158 4786 scope.go:117] "RemoveContainer" containerID="d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.363023 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.363133 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.363202 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqmqj\" (UniqueName: 
\"kubernetes.io/projected/5b8e8868-a319-487c-b1d8-8070f652b3cf-kube-api-access-qqmqj\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.363270 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.363331 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.363413 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j525q\" (UniqueName: \"kubernetes.io/projected/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-kube-api-access-j525q\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.363504 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27124d0e-8b24-4aac-86fc-4c7ab1a3531a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.363592 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b8e8868-a319-487c-b1d8-8070f652b3cf-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.373105 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b8e8868-a319-487c-b1d8-8070f652b3cf" (UID: "5b8e8868-a319-487c-b1d8-8070f652b3cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.389477 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-config-data" (OuterVolumeSpecName: "config-data") pod "5b8e8868-a319-487c-b1d8-8070f652b3cf" (UID: "5b8e8868-a319-487c-b1d8-8070f652b3cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.392741 4786 scope.go:117] "RemoveContainer" containerID="17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.418493 4786 scope.go:117] "RemoveContainer" containerID="79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.443714 4786 scope.go:117] "RemoveContainer" containerID="494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.444133 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c\": container with ID starting with 494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c not found: ID does not exist" containerID="494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.444162 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c"} err="failed to get container status \"494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c\": rpc error: code = NotFound desc = could not find container \"494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c\": container with ID starting with 494a7a9c6ac3a7f14a79e575ccbe2a02ade8e1b5dda6c86cd703d7628422a91c not found: ID does not exist"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.444181 4786 scope.go:117] "RemoveContainer" containerID="d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.444455 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a\": container with ID starting with d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a not found: ID does not exist" containerID="d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.444478 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a"} err="failed to get container status \"d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a\": rpc error: code = NotFound desc = could not find container \"d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a\": container with ID starting with d74b645b6a4aca36bac09700f4a71b96147d3bba8523f2a9f5a60d9bb004215a not found: ID does not exist"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.444495 4786 scope.go:117] "RemoveContainer" containerID="17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.444725 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7\": container with ID starting with 17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7 not found: ID does not exist" containerID="17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.444746 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7"} err="failed to get container status \"17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7\": rpc error: code = NotFound desc = could not find container \"17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7\": container with ID starting with 17959c6bd7d5a8cc1ac334545875c9f231c3c6c4129e5409ba818289667430b7 not found: ID does not exist"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.444777 4786 scope.go:117] "RemoveContainer" containerID="79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.445015 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887\": container with ID starting with 79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887 not found: ID does not exist" containerID="79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.445037 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887"} err="failed to get container status \"79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887\": rpc error: code = NotFound desc = could not find container \"79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887\": container with ID starting with 79fc93a27449efc1691fd113eda04486eebf7c31d499d1158246d347e1b74887 not found: ID does not exist"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.445052 4786 scope.go:117] "RemoveContainer" containerID="060bb336db765c93dd9c2cf7c1d9d5354851a4dddc1d5d05002e48ee62170aaa"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.464676 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.464700 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e8868-a319-487c-b1d8-8070f652b3cf-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.473507 4786 scope.go:117] "RemoveContainer" containerID="e6a65578a92790a911098a672c1d67fd4e80922281eff90b52f359eb042acc5d"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.497005 4786 scope.go:117] "RemoveContainer" containerID="d3127b185b39c452f33ab78267d8b882fda4cf0a792c0c8a71ea9c249355d81a"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.627509 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.635461 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.658827 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.659436 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerName="extract-content"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.659462 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerName="extract-content"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.659496 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="ceilometer-notification-agent"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.659507 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="ceilometer-notification-agent"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.659531 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="sg-core"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.659542 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="sg-core"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.659556 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerName="registry-server"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.659564 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerName="registry-server"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.659587 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="ceilometer-central-agent"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.659597 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="ceilometer-central-agent"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.659622 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e83607-3ddd-4f8f-885d-b723affa2133" containerName="oc"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.659631 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e83607-3ddd-4f8f-885d-b723affa2133" containerName="oc"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.659652 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="proxy-httpd"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.659662 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="proxy-httpd"
Mar 13 12:12:06 crc kubenswrapper[4786]: E0313 12:12:06.659688 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerName="extract-utilities"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.659699 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerName="extract-utilities"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.660022 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="ceilometer-notification-agent"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.660052 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="proxy-httpd"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.660071 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="sg-core"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.660089 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" containerName="registry-server"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.660109 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" containerName="ceilometer-central-agent"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.660126 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e83607-3ddd-4f8f-885d-b723affa2133" containerName="oc"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.662964 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.665017 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.665721 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.666796 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.668365 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fh478"]
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.678324 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fh478"]
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.688038 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.769320 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.769517 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.769591 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.769635 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbnzr\" (UniqueName: \"kubernetes.io/projected/7b9ce452-adfd-4058-afdc-6dd53158cb93-kube-api-access-mbnzr\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.769698 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-scripts\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.769803 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-log-httpd\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.769826 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-config-data\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.769841 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-run-httpd\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.870956 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-log-httpd\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.870996 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-config-data\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.871014 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-run-httpd\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.871048 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.871065 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.871113 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.871153 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbnzr\" (UniqueName: \"kubernetes.io/projected/7b9ce452-adfd-4058-afdc-6dd53158cb93-kube-api-access-mbnzr\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.871186 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-scripts\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.871791 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-log-httpd\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.872146 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-run-httpd\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.876650 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.877867 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.878291 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-config-data\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.878605 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.885771 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-scripts\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.887597 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbnzr\" (UniqueName: \"kubernetes.io/projected/7b9ce452-adfd-4058-afdc-6dd53158cb93-kube-api-access-mbnzr\") pod \"ceilometer-0\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " pod="openstack/ceilometer-0"
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.984286 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-9kvnq"]
Mar 13 12:12:06 crc kubenswrapper[4786]: I0313 12:12:06.994087 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-9kvnq"]
Mar 13 12:12:07 crc kubenswrapper[4786]: I0313 12:12:07.022746 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 12:12:07 crc kubenswrapper[4786]: I0313 12:12:07.352071 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62cxs" event={"ID":"b8bad192-9467-4657-b343-1fa02b6f2702","Type":"ContainerStarted","Data":"662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae"}
Mar 13 12:12:07 crc kubenswrapper[4786]: I0313 12:12:07.457348 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27124d0e-8b24-4aac-86fc-4c7ab1a3531a" path="/var/lib/kubelet/pods/27124d0e-8b24-4aac-86fc-4c7ab1a3531a/volumes"
Mar 13 12:12:07 crc kubenswrapper[4786]: I0313 12:12:07.464052 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8e8868-a319-487c-b1d8-8070f652b3cf" path="/var/lib/kubelet/pods/5b8e8868-a319-487c-b1d8-8070f652b3cf/volumes"
Mar 13 12:12:07 crc kubenswrapper[4786]: I0313 12:12:07.465476 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb268aac-2924-44f6-9f5a-1cd5a3c770a6" path="/var/lib/kubelet/pods/cb268aac-2924-44f6-9f5a-1cd5a3c770a6/volumes"
Mar 13 12:12:07 crc kubenswrapper[4786]: I0313 12:12:07.470637 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-62cxs" podStartSLOduration=3.275797354 podStartE2EDuration="8.470619109s" podCreationTimestamp="2026-03-13 12:11:59 +0000 UTC" firstStartedPulling="2026-03-13 12:12:01.133434457 +0000 UTC m=+1508.413087904" lastFinishedPulling="2026-03-13 12:12:06.328256212 +0000 UTC m=+1513.607909659" observedRunningTime="2026-03-13 12:12:07.373045691 +0000 UTC m=+1514.652699168" watchObservedRunningTime="2026-03-13 12:12:07.470619109 +0000 UTC m=+1514.750272556"
Mar 13 12:12:07 crc kubenswrapper[4786]: I0313 12:12:07.471332 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:12:08 crc kubenswrapper[4786]: I0313 12:12:08.363436 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerStarted","Data":"aa0e470bda35a03af8c66fee311a44ee6d872ab12f8e892d99037221a796c9b2"}
Mar 13 12:12:08 crc kubenswrapper[4786]: I0313 12:12:08.363957 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerStarted","Data":"f27c1e2473b56cfe31f05c9f69acaf2bb5ab6dd443a461bf2b99af7528b8b17c"}
Mar 13 12:12:08 crc kubenswrapper[4786]: I0313 12:12:08.440826 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 12:12:08 crc kubenswrapper[4786]: I0313 12:12:08.441281 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 12:12:08 crc kubenswrapper[4786]: I0313 12:12:08.448074 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 12:12:09 crc kubenswrapper[4786]: I0313 12:12:09.374800 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerStarted","Data":"ff58b2de16ed9a51881f71d86c96837081f8fcc0dbee6c16cf7ba252861c314e"}
Mar 13 12:12:09 crc kubenswrapper[4786]: I0313 12:12:09.383951 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 12:12:10 crc kubenswrapper[4786]: I0313 12:12:10.021854 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-62cxs"
Mar 13 12:12:10 crc kubenswrapper[4786]: I0313 12:12:10.022185 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-62cxs"
Mar 13 12:12:10 crc kubenswrapper[4786]: I0313 12:12:10.397657 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerStarted","Data":"8e3087f5d62be2a686205362bb92d0458ee30cacc59008a101d71f907f7aadcc"}
Mar 13 12:12:10 crc kubenswrapper[4786]: W0313 12:12:10.845538 4786 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e83607_3ddd_4f8f_885d_b723affa2133.slice/crio-6efc90f188365ee08caab3657494c30d40b0c9115416db771a595ba2ace19feb": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e83607_3ddd_4f8f_885d_b723affa2133.slice/crio-6efc90f188365ee08caab3657494c30d40b0c9115416db771a595ba2ace19feb: no such file or directory
Mar 13 12:12:10 crc kubenswrapper[4786]: W0313 12:12:10.847988 4786 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bad192_9467_4657_b343_1fa02b6f2702.slice/crio-conmon-81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bad192_9467_4657_b343_1fa02b6f2702.slice/crio-conmon-81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4.scope: no such file or directory
Mar 13 12:12:10 crc kubenswrapper[4786]: W0313 12:12:10.848053 4786 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bad192_9467_4657_b343_1fa02b6f2702.slice/crio-81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bad192_9467_4657_b343_1fa02b6f2702.slice/crio-81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4.scope: no such file or directory
Mar 13 12:12:10 crc kubenswrapper[4786]: W0313 12:12:10.848076 4786 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e83607_3ddd_4f8f_885d_b723affa2133.slice/crio-conmon-c4439fbf918b3a73b489027b0f4677a7f89a0a3677dcff0e1750fe03ad5bddb6.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e83607_3ddd_4f8f_885d_b723affa2133.slice/crio-conmon-c4439fbf918b3a73b489027b0f4677a7f89a0a3677dcff0e1750fe03ad5bddb6.scope: no such file or directory
Mar 13 12:12:10 crc kubenswrapper[4786]: W0313 12:12:10.848091 4786 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e83607_3ddd_4f8f_885d_b723affa2133.slice/crio-c4439fbf918b3a73b489027b0f4677a7f89a0a3677dcff0e1750fe03ad5bddb6.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e83607_3ddd_4f8f_885d_b723affa2133.slice/crio-c4439fbf918b3a73b489027b0f4677a7f89a0a3677dcff0e1750fe03ad5bddb6.scope: no such file or directory
Mar 13 12:12:10 crc kubenswrapper[4786]: W0313 12:12:10.849245 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27124d0e_8b24_4aac_86fc_4c7ab1a3531a.slice/crio-060bb336db765c93dd9c2cf7c1d9d5354851a4dddc1d5d05002e48ee62170aaa.scope WatchSource:0}: Error finding container 060bb336db765c93dd9c2cf7c1d9d5354851a4dddc1d5d05002e48ee62170aaa: Status 404 returned error can't find the container with id 060bb336db765c93dd9c2cf7c1d9d5354851a4dddc1d5d05002e48ee62170aaa
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.077085 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-62cxs" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="registry-server" probeResult="failure" output=<
Mar 13 12:12:11 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s
Mar 13 12:12:11 crc kubenswrapper[4786]: >
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.247274 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.357094 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr7mz\" (UniqueName: \"kubernetes.io/projected/d84e3cb0-36a5-411a-9463-c9237f1eb943-kube-api-access-tr7mz\") pod \"d84e3cb0-36a5-411a-9463-c9237f1eb943\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") "
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.357433 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-combined-ca-bundle\") pod \"d84e3cb0-36a5-411a-9463-c9237f1eb943\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") "
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.357518 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-config-data\") pod \"d84e3cb0-36a5-411a-9463-c9237f1eb943\" (UID: \"d84e3cb0-36a5-411a-9463-c9237f1eb943\") "
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.362032 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d84e3cb0-36a5-411a-9463-c9237f1eb943-kube-api-access-tr7mz" (OuterVolumeSpecName: "kube-api-access-tr7mz") pod "d84e3cb0-36a5-411a-9463-c9237f1eb943" (UID: "d84e3cb0-36a5-411a-9463-c9237f1eb943"). InnerVolumeSpecName "kube-api-access-tr7mz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.394186 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-config-data" (OuterVolumeSpecName: "config-data") pod "d84e3cb0-36a5-411a-9463-c9237f1eb943" (UID: "d84e3cb0-36a5-411a-9463-c9237f1eb943"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.410485 4786 generic.go:334] "Generic (PLEG): container finished" podID="d84e3cb0-36a5-411a-9463-c9237f1eb943" containerID="b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4" exitCode=137
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.410531 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.410558 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d84e3cb0-36a5-411a-9463-c9237f1eb943","Type":"ContainerDied","Data":"b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4"}
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.410635 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d84e3cb0-36a5-411a-9463-c9237f1eb943","Type":"ContainerDied","Data":"cdd4fc5c5151c78f26a5aebab2f1099029f9c3d7496855714957956ff00d005f"}
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.410660 4786 scope.go:117] "RemoveContainer" containerID="b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.422094 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d84e3cb0-36a5-411a-9463-c9237f1eb943" (UID: "d84e3cb0-36a5-411a-9463-c9237f1eb943"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.460449 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.460481 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr7mz\" (UniqueName: \"kubernetes.io/projected/d84e3cb0-36a5-411a-9463-c9237f1eb943-kube-api-access-tr7mz\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.460494 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d84e3cb0-36a5-411a-9463-c9237f1eb943-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.463224 4786 scope.go:117] "RemoveContainer" containerID="b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4"
Mar 13 12:12:11 crc kubenswrapper[4786]: E0313 12:12:11.466126 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4\": container with ID starting with b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4 not found: ID does not exist" containerID="b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.466179 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4"} err="failed to get container status \"b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4\": rpc error: code = NotFound desc = could not find container \"b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4\": container with ID starting with b09fd7f36fa30e6fd3b7b1ead04616729f5ae452b6c0a4d28643a7ede408bfa4 not found: ID does not exist"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.730523 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.741080 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.757061 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 12:12:11 crc kubenswrapper[4786]: E0313 12:12:11.757545 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84e3cb0-36a5-411a-9463-c9237f1eb943" containerName="nova-cell1-novncproxy-novncproxy"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.757562 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84e3cb0-36a5-411a-9463-c9237f1eb943" containerName="nova-cell1-novncproxy-novncproxy"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.757755 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d84e3cb0-36a5-411a-9463-c9237f1eb943" containerName="nova-cell1-novncproxy-novncproxy"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.759206 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.762941 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.762974 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.763118 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.769979 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.867554 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.867836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xhrc\" (UniqueName: \"kubernetes.io/projected/cbebf1d2-7723-4d09-85de-a7e630caad3b-kube-api-access-4xhrc\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.868022 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:12:11
crc kubenswrapper[4786]: I0313 12:12:11.868186 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.868285 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.969492 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.969595 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.969626 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.969678 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.969795 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xhrc\" (UniqueName: \"kubernetes.io/projected/cbebf1d2-7723-4d09-85de-a7e630caad3b-kube-api-access-4xhrc\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.975444 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.975988 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.978189 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.983969 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:11 crc kubenswrapper[4786]: I0313 12:12:11.989754 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xhrc\" (UniqueName: \"kubernetes.io/projected/cbebf1d2-7723-4d09-85de-a7e630caad3b-kube-api-access-4xhrc\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:12 crc kubenswrapper[4786]: I0313 12:12:12.087726 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:12 crc kubenswrapper[4786]: I0313 12:12:12.424018 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerStarted","Data":"54adeada9579b0b6b36af08b39ac2348320ad9ad09b29d79ce79bbcda106787f"} Mar 13 12:12:12 crc kubenswrapper[4786]: I0313 12:12:12.424375 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:12:12 crc kubenswrapper[4786]: I0313 12:12:12.465508 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.637323926 podStartE2EDuration="6.465489278s" podCreationTimestamp="2026-03-13 12:12:06 +0000 UTC" firstStartedPulling="2026-03-13 12:12:07.480799305 +0000 UTC m=+1514.760452752" lastFinishedPulling="2026-03-13 12:12:11.308964647 +0000 UTC m=+1518.588618104" observedRunningTime="2026-03-13 12:12:12.45230439 +0000 UTC m=+1519.731957857" watchObservedRunningTime="2026-03-13 12:12:12.465489278 +0000 UTC m=+1519.745142725" Mar 13 12:12:12 crc kubenswrapper[4786]: I0313 12:12:12.691662 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:12:12 crc kubenswrapper[4786]: W0313 12:12:12.694479 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbebf1d2_7723_4d09_85de_a7e630caad3b.slice/crio-0375897e15c4dd7ecbe37dd8c4b9b34d616b183dfdff1667d0aa74cb777bce21 WatchSource:0}: Error finding container 0375897e15c4dd7ecbe37dd8c4b9b34d616b183dfdff1667d0aa74cb777bce21: Status 404 returned error can't find the container with id 0375897e15c4dd7ecbe37dd8c4b9b34d616b183dfdff1667d0aa74cb777bce21 Mar 13 12:12:13 crc kubenswrapper[4786]: I0313 12:12:13.323233 4786 scope.go:117] "RemoveContainer" containerID="9f7c10ddaa42f6b54b6af8e66973ed9e444fdeff3412fa2305b7b72c37c849f7" Mar 13 12:12:13 crc kubenswrapper[4786]: I0313 12:12:13.491442 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d84e3cb0-36a5-411a-9463-c9237f1eb943" path="/var/lib/kubelet/pods/d84e3cb0-36a5-411a-9463-c9237f1eb943/volumes" Mar 13 12:12:13 crc kubenswrapper[4786]: I0313 12:12:13.492260 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cbebf1d2-7723-4d09-85de-a7e630caad3b","Type":"ContainerStarted","Data":"f743b475f641e7e3d24360e430567f8d2e636c83bf391096a2497df4fab645b6"} Mar 13 12:12:13 crc kubenswrapper[4786]: I0313 12:12:13.492295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cbebf1d2-7723-4d09-85de-a7e630caad3b","Type":"ContainerStarted","Data":"0375897e15c4dd7ecbe37dd8c4b9b34d616b183dfdff1667d0aa74cb777bce21"} Mar 13 12:12:13 crc kubenswrapper[4786]: I0313 12:12:13.515370 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.515350924 podStartE2EDuration="2.515350924s" podCreationTimestamp="2026-03-13 12:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:12:13.498339103 +0000 UTC m=+1520.777992560" watchObservedRunningTime="2026-03-13 12:12:13.515350924 +0000 UTC m=+1520.795004371" Mar 13 12:12:13 crc kubenswrapper[4786]: I0313 12:12:13.588845 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 12:12:14 crc kubenswrapper[4786]: I0313 12:12:14.484273 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:12:14 crc kubenswrapper[4786]: I0313 12:12:14.485074 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:12:14 crc kubenswrapper[4786]: I0313 12:12:14.487484 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:12:14 crc kubenswrapper[4786]: I0313 12:12:14.488130 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.482374 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.484928 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.693834 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69ffc749-26dmp"] Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.695348 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.714660 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-26dmp"] Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.747530 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-nb\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.747625 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-swift-storage-0\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.747673 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-svc\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.747716 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27kc\" (UniqueName: \"kubernetes.io/projected/8851cf9e-656d-439d-a0d8-a16bdc843d87-kube-api-access-t27kc\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.747742 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-config\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.747770 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-sb\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.848621 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-config\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.848667 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-sb\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.848703 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-nb\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.848822 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-swift-storage-0\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.848872 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-svc\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.848937 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27kc\" (UniqueName: \"kubernetes.io/projected/8851cf9e-656d-439d-a0d8-a16bdc843d87-kube-api-access-t27kc\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.849439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-config\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.849745 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-nb\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.849744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-sb\") pod 
\"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.849997 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-swift-storage-0\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.850441 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-svc\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:15 crc kubenswrapper[4786]: I0313 12:12:15.875608 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27kc\" (UniqueName: \"kubernetes.io/projected/8851cf9e-656d-439d-a0d8-a16bdc843d87-kube-api-access-t27kc\") pod \"dnsmasq-dns-69ffc749-26dmp\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:16 crc kubenswrapper[4786]: I0313 12:12:16.050367 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:16 crc kubenswrapper[4786]: I0313 12:12:16.548478 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-26dmp"] Mar 13 12:12:16 crc kubenswrapper[4786]: W0313 12:12:16.549014 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8851cf9e_656d_439d_a0d8_a16bdc843d87.slice/crio-19c390b8d90e14fda47c66027733e1779bad7b3e6676beda27b49c7149f50586 WatchSource:0}: Error finding container 19c390b8d90e14fda47c66027733e1779bad7b3e6676beda27b49c7149f50586: Status 404 returned error can't find the container with id 19c390b8d90e14fda47c66027733e1779bad7b3e6676beda27b49c7149f50586 Mar 13 12:12:17 crc kubenswrapper[4786]: I0313 12:12:17.088641 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:17 crc kubenswrapper[4786]: I0313 12:12:17.501273 4786 generic.go:334] "Generic (PLEG): container finished" podID="8851cf9e-656d-439d-a0d8-a16bdc843d87" containerID="2fe2f63b3058a5332c6e4555092190fe27dedba1ec7e07e481f9f8d9908c7412" exitCode=0 Mar 13 12:12:17 crc kubenswrapper[4786]: I0313 12:12:17.501313 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-26dmp" event={"ID":"8851cf9e-656d-439d-a0d8-a16bdc843d87","Type":"ContainerDied","Data":"2fe2f63b3058a5332c6e4555092190fe27dedba1ec7e07e481f9f8d9908c7412"} Mar 13 12:12:17 crc kubenswrapper[4786]: I0313 12:12:17.501654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-26dmp" event={"ID":"8851cf9e-656d-439d-a0d8-a16bdc843d87","Type":"ContainerStarted","Data":"19c390b8d90e14fda47c66027733e1779bad7b3e6676beda27b49c7149f50586"} Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.076843 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:12:18 crc 
kubenswrapper[4786]: I0313 12:12:18.077427 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="ceilometer-central-agent" containerID="cri-o://aa0e470bda35a03af8c66fee311a44ee6d872ab12f8e892d99037221a796c9b2" gracePeriod=30 Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.077495 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="proxy-httpd" containerID="cri-o://54adeada9579b0b6b36af08b39ac2348320ad9ad09b29d79ce79bbcda106787f" gracePeriod=30 Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.077582 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="ceilometer-notification-agent" containerID="cri-o://ff58b2de16ed9a51881f71d86c96837081f8fcc0dbee6c16cf7ba252861c314e" gracePeriod=30 Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.077582 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="sg-core" containerID="cri-o://8e3087f5d62be2a686205362bb92d0458ee30cacc59008a101d71f907f7aadcc" gracePeriod=30 Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.544558 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-26dmp" event={"ID":"8851cf9e-656d-439d-a0d8-a16bdc843d87","Type":"ContainerStarted","Data":"894cb93b5b88c9cda5163ee38d365a1d2b6f2902380b8a636a78fe6a94bbf669"} Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.545699 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.576763 4786 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-69ffc749-26dmp" podStartSLOduration=3.5767492990000003 podStartE2EDuration="3.576749299s" podCreationTimestamp="2026-03-13 12:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:12:18.574684322 +0000 UTC m=+1525.854337769" watchObservedRunningTime="2026-03-13 12:12:18.576749299 +0000 UTC m=+1525.856402746" Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.597600 4786 generic.go:334] "Generic (PLEG): container finished" podID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerID="54adeada9579b0b6b36af08b39ac2348320ad9ad09b29d79ce79bbcda106787f" exitCode=0 Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.597632 4786 generic.go:334] "Generic (PLEG): container finished" podID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerID="8e3087f5d62be2a686205362bb92d0458ee30cacc59008a101d71f907f7aadcc" exitCode=2 Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.597640 4786 generic.go:334] "Generic (PLEG): container finished" podID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerID="aa0e470bda35a03af8c66fee311a44ee6d872ab12f8e892d99037221a796c9b2" exitCode=0 Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.597662 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerDied","Data":"54adeada9579b0b6b36af08b39ac2348320ad9ad09b29d79ce79bbcda106787f"} Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.597687 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerDied","Data":"8e3087f5d62be2a686205362bb92d0458ee30cacc59008a101d71f907f7aadcc"} Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.597699 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerDied","Data":"aa0e470bda35a03af8c66fee311a44ee6d872ab12f8e892d99037221a796c9b2"} Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.765689 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.765902 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-log" containerID="cri-o://afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676" gracePeriod=30 Mar 13 12:12:18 crc kubenswrapper[4786]: I0313 12:12:18.765977 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-api" containerID="cri-o://33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985" gracePeriod=30 Mar 13 12:12:19 crc kubenswrapper[4786]: I0313 12:12:19.632495 4786 generic.go:334] "Generic (PLEG): container finished" podID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerID="afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676" exitCode=143 Mar 13 12:12:19 crc kubenswrapper[4786]: I0313 12:12:19.632809 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b970b959-5fc2-4486-a6af-f931f04f0eb0","Type":"ContainerDied","Data":"afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676"} Mar 13 12:12:19 crc kubenswrapper[4786]: I0313 12:12:19.637900 4786 generic.go:334] "Generic (PLEG): container finished" podID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerID="ff58b2de16ed9a51881f71d86c96837081f8fcc0dbee6c16cf7ba252861c314e" exitCode=0 Mar 13 12:12:19 crc kubenswrapper[4786]: I0313 12:12:19.638451 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerDied","Data":"ff58b2de16ed9a51881f71d86c96837081f8fcc0dbee6c16cf7ba252861c314e"} Mar 13 12:12:19 crc kubenswrapper[4786]: I0313 12:12:19.846405 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.034404 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-sg-core-conf-yaml\") pod \"7b9ce452-adfd-4058-afdc-6dd53158cb93\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.034481 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-ceilometer-tls-certs\") pod \"7b9ce452-adfd-4058-afdc-6dd53158cb93\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.034513 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-scripts\") pod \"7b9ce452-adfd-4058-afdc-6dd53158cb93\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.034556 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-combined-ca-bundle\") pod \"7b9ce452-adfd-4058-afdc-6dd53158cb93\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.034571 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-config-data\") pod 
\"7b9ce452-adfd-4058-afdc-6dd53158cb93\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.034599 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-run-httpd\") pod \"7b9ce452-adfd-4058-afdc-6dd53158cb93\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.034627 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-log-httpd\") pod \"7b9ce452-adfd-4058-afdc-6dd53158cb93\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.034712 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbnzr\" (UniqueName: \"kubernetes.io/projected/7b9ce452-adfd-4058-afdc-6dd53158cb93-kube-api-access-mbnzr\") pod \"7b9ce452-adfd-4058-afdc-6dd53158cb93\" (UID: \"7b9ce452-adfd-4058-afdc-6dd53158cb93\") " Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.035004 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b9ce452-adfd-4058-afdc-6dd53158cb93" (UID: "7b9ce452-adfd-4058-afdc-6dd53158cb93"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.035129 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b9ce452-adfd-4058-afdc-6dd53158cb93" (UID: "7b9ce452-adfd-4058-afdc-6dd53158cb93"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.035197 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.040456 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9ce452-adfd-4058-afdc-6dd53158cb93-kube-api-access-mbnzr" (OuterVolumeSpecName: "kube-api-access-mbnzr") pod "7b9ce452-adfd-4058-afdc-6dd53158cb93" (UID: "7b9ce452-adfd-4058-afdc-6dd53158cb93"). InnerVolumeSpecName "kube-api-access-mbnzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.046095 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-scripts" (OuterVolumeSpecName: "scripts") pod "7b9ce452-adfd-4058-afdc-6dd53158cb93" (UID: "7b9ce452-adfd-4058-afdc-6dd53158cb93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.089489 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.101250 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b9ce452-adfd-4058-afdc-6dd53158cb93" (UID: "7b9ce452-adfd-4058-afdc-6dd53158cb93"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.118270 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7b9ce452-adfd-4058-afdc-6dd53158cb93" (UID: "7b9ce452-adfd-4058-afdc-6dd53158cb93"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.136946 4786 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.136979 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.136988 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b9ce452-adfd-4058-afdc-6dd53158cb93-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.136997 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbnzr\" (UniqueName: \"kubernetes.io/projected/7b9ce452-adfd-4058-afdc-6dd53158cb93-kube-api-access-mbnzr\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.137007 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.137930 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b9ce452-adfd-4058-afdc-6dd53158cb93" (UID: "7b9ce452-adfd-4058-afdc-6dd53158cb93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.146788 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-config-data" (OuterVolumeSpecName: "config-data") pod "7b9ce452-adfd-4058-afdc-6dd53158cb93" (UID: "7b9ce452-adfd-4058-afdc-6dd53158cb93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.167430 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.239127 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.239177 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ce452-adfd-4058-afdc-6dd53158cb93-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.327633 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-62cxs"] Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.647943 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b9ce452-adfd-4058-afdc-6dd53158cb93","Type":"ContainerDied","Data":"f27c1e2473b56cfe31f05c9f69acaf2bb5ab6dd443a461bf2b99af7528b8b17c"} Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.648004 4786 
scope.go:117] "RemoveContainer" containerID="54adeada9579b0b6b36af08b39ac2348320ad9ad09b29d79ce79bbcda106787f" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.648174 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.673646 4786 scope.go:117] "RemoveContainer" containerID="8e3087f5d62be2a686205362bb92d0458ee30cacc59008a101d71f907f7aadcc" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.683324 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.690302 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.704408 4786 scope.go:117] "RemoveContainer" containerID="ff58b2de16ed9a51881f71d86c96837081f8fcc0dbee6c16cf7ba252861c314e" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.725081 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:12:20 crc kubenswrapper[4786]: E0313 12:12:20.725651 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="ceilometer-central-agent" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.725673 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="ceilometer-central-agent" Mar 13 12:12:20 crc kubenswrapper[4786]: E0313 12:12:20.725694 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="sg-core" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.725704 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="sg-core" Mar 13 12:12:20 crc kubenswrapper[4786]: E0313 12:12:20.725726 4786 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="ceilometer-notification-agent" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.725735 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="ceilometer-notification-agent" Mar 13 12:12:20 crc kubenswrapper[4786]: E0313 12:12:20.725749 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="proxy-httpd" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.725755 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="proxy-httpd" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.725945 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="ceilometer-notification-agent" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.725963 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="proxy-httpd" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.725987 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="ceilometer-central-agent" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.726004 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" containerName="sg-core" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.731413 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.732719 4786 scope.go:117] "RemoveContainer" containerID="aa0e470bda35a03af8c66fee311a44ee6d872ab12f8e892d99037221a796c9b2" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.734103 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.734545 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.736124 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.738560 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.860646 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.860743 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-run-httpd\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.861562 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " 
pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.861641 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwpl9\" (UniqueName: \"kubernetes.io/projected/1fcb8b06-7f98-4c8b-bae2-1bf657791194-kube-api-access-xwpl9\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.861790 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-log-httpd\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.861867 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-scripts\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.862224 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-config-data\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.862315 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.964631 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.964728 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwpl9\" (UniqueName: \"kubernetes.io/projected/1fcb8b06-7f98-4c8b-bae2-1bf657791194-kube-api-access-xwpl9\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.964840 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-log-httpd\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.965016 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-scripts\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.965098 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-config-data\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.965338 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.965394 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.965450 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-run-httpd\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.965588 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-log-httpd\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.966290 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-run-httpd\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.969908 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-config-data\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.970003 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.970591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.970746 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-scripts\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.972815 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:20 crc kubenswrapper[4786]: I0313 12:12:20.984452 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwpl9\" (UniqueName: \"kubernetes.io/projected/1fcb8b06-7f98-4c8b-bae2-1bf657791194-kube-api-access-xwpl9\") pod \"ceilometer-0\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " pod="openstack/ceilometer-0" Mar 13 12:12:21 crc kubenswrapper[4786]: I0313 12:12:21.087017 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:12:21 crc kubenswrapper[4786]: I0313 12:12:21.388371 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:12:21 crc kubenswrapper[4786]: W0313 12:12:21.397054 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fcb8b06_7f98_4c8b_bae2_1bf657791194.slice/crio-71453dabac256580b94aacf743b7f04cfc1627ce80961f0084ce61ab9770472e WatchSource:0}: Error finding container 71453dabac256580b94aacf743b7f04cfc1627ce80961f0084ce61ab9770472e: Status 404 returned error can't find the container with id 71453dabac256580b94aacf743b7f04cfc1627ce80961f0084ce61ab9770472e Mar 13 12:12:21 crc kubenswrapper[4786]: I0313 12:12:21.451963 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9ce452-adfd-4058-afdc-6dd53158cb93" path="/var/lib/kubelet/pods/7b9ce452-adfd-4058-afdc-6dd53158cb93/volumes" Mar 13 12:12:21 crc kubenswrapper[4786]: I0313 12:12:21.662662 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerStarted","Data":"71453dabac256580b94aacf743b7f04cfc1627ce80961f0084ce61ab9770472e"} Mar 13 12:12:21 crc kubenswrapper[4786]: I0313 12:12:21.662827 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-62cxs" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="registry-server" containerID="cri-o://662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae" gracePeriod=2 Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.090310 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.129309 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.273475 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.382651 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.397398 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgm7m\" (UniqueName: \"kubernetes.io/projected/b8bad192-9467-4657-b343-1fa02b6f2702-kube-api-access-qgm7m\") pod \"b8bad192-9467-4657-b343-1fa02b6f2702\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.398364 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b970b959-5fc2-4486-a6af-f931f04f0eb0-logs\") pod \"b970b959-5fc2-4486-a6af-f931f04f0eb0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.398455 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-catalog-content\") pod \"b8bad192-9467-4657-b343-1fa02b6f2702\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.398555 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-utilities\") pod \"b8bad192-9467-4657-b343-1fa02b6f2702\" (UID: \"b8bad192-9467-4657-b343-1fa02b6f2702\") " Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.398726 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-config-data\") pod \"b970b959-5fc2-4486-a6af-f931f04f0eb0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.398801 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-combined-ca-bundle\") pod \"b970b959-5fc2-4486-a6af-f931f04f0eb0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.398876 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqz24\" (UniqueName: \"kubernetes.io/projected/b970b959-5fc2-4486-a6af-f931f04f0eb0-kube-api-access-mqz24\") pod \"b970b959-5fc2-4486-a6af-f931f04f0eb0\" (UID: \"b970b959-5fc2-4486-a6af-f931f04f0eb0\") " Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.400395 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-utilities" (OuterVolumeSpecName: "utilities") pod "b8bad192-9467-4657-b343-1fa02b6f2702" (UID: "b8bad192-9467-4657-b343-1fa02b6f2702"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.402685 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b970b959-5fc2-4486-a6af-f931f04f0eb0-logs" (OuterVolumeSpecName: "logs") pod "b970b959-5fc2-4486-a6af-f931f04f0eb0" (UID: "b970b959-5fc2-4486-a6af-f931f04f0eb0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.404402 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b970b959-5fc2-4486-a6af-f931f04f0eb0-kube-api-access-mqz24" (OuterVolumeSpecName: "kube-api-access-mqz24") pod "b970b959-5fc2-4486-a6af-f931f04f0eb0" (UID: "b970b959-5fc2-4486-a6af-f931f04f0eb0"). InnerVolumeSpecName "kube-api-access-mqz24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.433935 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8bad192-9467-4657-b343-1fa02b6f2702-kube-api-access-qgm7m" (OuterVolumeSpecName: "kube-api-access-qgm7m") pod "b8bad192-9467-4657-b343-1fa02b6f2702" (UID: "b8bad192-9467-4657-b343-1fa02b6f2702"). InnerVolumeSpecName "kube-api-access-qgm7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.441001 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-config-data" (OuterVolumeSpecName: "config-data") pod "b970b959-5fc2-4486-a6af-f931f04f0eb0" (UID: "b970b959-5fc2-4486-a6af-f931f04f0eb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.487116 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b970b959-5fc2-4486-a6af-f931f04f0eb0" (UID: "b970b959-5fc2-4486-a6af-f931f04f0eb0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.502197 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.502571 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.502665 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b970b959-5fc2-4486-a6af-f931f04f0eb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.502746 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqz24\" (UniqueName: \"kubernetes.io/projected/b970b959-5fc2-4486-a6af-f931f04f0eb0-kube-api-access-mqz24\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.502824 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgm7m\" (UniqueName: \"kubernetes.io/projected/b8bad192-9467-4657-b343-1fa02b6f2702-kube-api-access-qgm7m\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.502930 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b970b959-5fc2-4486-a6af-f931f04f0eb0-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.590007 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8bad192-9467-4657-b343-1fa02b6f2702" (UID: 
"b8bad192-9467-4657-b343-1fa02b6f2702"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.603808 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bad192-9467-4657-b343-1fa02b6f2702-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.671726 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerStarted","Data":"1b0b367e7cd0a1267707201fcc6eb17e95461077f6d3e9b86822b55b231ea0c0"} Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.674706 4786 generic.go:334] "Generic (PLEG): container finished" podID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerID="33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985" exitCode=0 Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.674796 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b970b959-5fc2-4486-a6af-f931f04f0eb0","Type":"ContainerDied","Data":"33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985"} Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.674939 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b970b959-5fc2-4486-a6af-f931f04f0eb0","Type":"ContainerDied","Data":"407a42e7638f54df847f414745f1fbecd3c06f313ebf055a555e09598c660e22"} Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.674992 4786 scope.go:117] "RemoveContainer" containerID="33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.674837 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.677659 4786 generic.go:334] "Generic (PLEG): container finished" podID="b8bad192-9467-4657-b343-1fa02b6f2702" containerID="662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae" exitCode=0 Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.677916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62cxs" event={"ID":"b8bad192-9467-4657-b343-1fa02b6f2702","Type":"ContainerDied","Data":"662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae"} Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.677970 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62cxs" event={"ID":"b8bad192-9467-4657-b343-1fa02b6f2702","Type":"ContainerDied","Data":"552609aa275fb8c509e8446bb25f62673a8f6c211476b9d2610f037212bdc8b3"} Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.678118 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-62cxs" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.700941 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.714083 4786 scope.go:117] "RemoveContainer" containerID="afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.716806 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.747079 4786 scope.go:117] "RemoveContainer" containerID="33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.748784 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.749091 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985\": container with ID starting with 33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985 not found: ID does not exist" containerID="33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.749141 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985"} err="failed to get container status \"33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985\": rpc error: code = NotFound desc = could not find container \"33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985\": container with ID starting with 33b940ffffb643da4592d0c4cdd98a33c76f7c967ef933ee92ad7a64b22e0985 not found: ID does not exist" Mar 13 12:12:22 crc 
kubenswrapper[4786]: I0313 12:12:22.749169 4786 scope.go:117] "RemoveContainer" containerID="afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676" Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.749439 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676\": container with ID starting with afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676 not found: ID does not exist" containerID="afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.749517 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676"} err="failed to get container status \"afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676\": rpc error: code = NotFound desc = could not find container \"afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676\": container with ID starting with afff7724f3179e28cf2b87e52678497cf3ea4ad57fc566e553b341c17bf35676 not found: ID does not exist" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.749590 4786 scope.go:117] "RemoveContainer" containerID="662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.786226 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.786814 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-api" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.786878 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-api" Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.787019 4786 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="registry-server" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.787069 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="registry-server" Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.787174 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="extract-utilities" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.787227 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="extract-utilities" Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.787278 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-log" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.787324 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-log" Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.787388 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="extract-content" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.787440 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="extract-content" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.787668 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-api" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.787724 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" containerName="nova-api-log" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.787775 4786 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" containerName="registry-server" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.788758 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.792658 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.792658 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.796240 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.800348 4786 scope.go:117] "RemoveContainer" containerID="81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.806670 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdnt8\" (UniqueName: \"kubernetes.io/projected/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-kube-api-access-hdnt8\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.806739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.806761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.806813 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-logs\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.806927 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-config-data\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.806977 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-public-tls-certs\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.807581 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-62cxs"] Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.827251 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-62cxs"] Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.833024 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.835328 4786 scope.go:117] "RemoveContainer" containerID="4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.871096 4786 
scope.go:117] "RemoveContainer" containerID="662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae" Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.871580 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae\": container with ID starting with 662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae not found: ID does not exist" containerID="662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.871614 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae"} err="failed to get container status \"662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae\": rpc error: code = NotFound desc = could not find container \"662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae\": container with ID starting with 662e27973839a8b08847dbf3747fcaabf38a8452162467c94c46c3744bf394ae not found: ID does not exist" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.871640 4786 scope.go:117] "RemoveContainer" containerID="81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4" Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.871935 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4\": container with ID starting with 81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4 not found: ID does not exist" containerID="81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.871982 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4"} err="failed to get container status \"81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4\": rpc error: code = NotFound desc = could not find container \"81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4\": container with ID starting with 81a9fc41df462409ca5d85c94cff787e7cde3ccca67b709273c1c6adf1788ea4 not found: ID does not exist" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.872015 4786 scope.go:117] "RemoveContainer" containerID="4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293" Mar 13 12:12:22 crc kubenswrapper[4786]: E0313 12:12:22.872369 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293\": container with ID starting with 4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293 not found: ID does not exist" containerID="4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.872403 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293"} err="failed to get container status \"4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293\": rpc error: code = NotFound desc = could not find container \"4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293\": container with ID starting with 4870168bca28a50c96cf5f527335488e61fa5647d3203014989b302946524293 not found: ID does not exist" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.910032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.910333 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.910434 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-logs\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.910743 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-config-data\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.910868 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-public-tls-certs\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.911358 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdnt8\" (UniqueName: \"kubernetes.io/projected/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-kube-api-access-hdnt8\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.912266 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-logs\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.912619 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dx9bx"] Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.913640 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.918547 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.918710 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.918931 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-public-tls-certs\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.926916 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.927370 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.931476 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-config-data\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.941023 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dx9bx"] Mar 13 12:12:22 crc kubenswrapper[4786]: I0313 12:12:22.952897 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdnt8\" (UniqueName: \"kubernetes.io/projected/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-kube-api-access-hdnt8\") pod \"nova-api-0\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " pod="openstack/nova-api-0" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.110789 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.118844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.118919 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-scripts\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.118973 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-config-data\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.119055 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdkdp\" (UniqueName: \"kubernetes.io/projected/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-kube-api-access-pdkdp\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.220716 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-scripts\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.223367 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-config-data\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.223487 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdkdp\" (UniqueName: \"kubernetes.io/projected/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-kube-api-access-pdkdp\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.223612 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.236529 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-scripts\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.239388 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.251304 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-config-data\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.261088 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdkdp\" (UniqueName: \"kubernetes.io/projected/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-kube-api-access-pdkdp\") pod \"nova-cell1-cell-mapping-dx9bx\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: W0313 12:12:23.425039 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56b9f4ad_c86a_43b2_8165_7a30fe4efc49.slice/crio-f5524a092934e54313cf667f2147ffe68d04d9e1f07f5471ae1a7d0b430be192 WatchSource:0}: Error finding container f5524a092934e54313cf667f2147ffe68d04d9e1f07f5471ae1a7d0b430be192: Status 404 returned error can't find the container with id f5524a092934e54313cf667f2147ffe68d04d9e1f07f5471ae1a7d0b430be192 Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.426043 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.456080 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8bad192-9467-4657-b343-1fa02b6f2702" path="/var/lib/kubelet/pods/b8bad192-9467-4657-b343-1fa02b6f2702/volumes" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.457866 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b970b959-5fc2-4486-a6af-f931f04f0eb0" path="/var/lib/kubelet/pods/b970b959-5fc2-4486-a6af-f931f04f0eb0/volumes" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.539740 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.707018 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56b9f4ad-c86a-43b2-8165-7a30fe4efc49","Type":"ContainerStarted","Data":"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2"} Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.707426 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56b9f4ad-c86a-43b2-8165-7a30fe4efc49","Type":"ContainerStarted","Data":"f5524a092934e54313cf667f2147ffe68d04d9e1f07f5471ae1a7d0b430be192"} Mar 13 12:12:23 crc kubenswrapper[4786]: I0313 12:12:23.709003 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerStarted","Data":"c90607a25b6719b906805f4956767bf9e8f2062f95bfc51dac8f6059d27ae384"} Mar 13 12:12:24 crc kubenswrapper[4786]: I0313 12:12:24.057770 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dx9bx"] Mar 13 12:12:24 crc kubenswrapper[4786]: W0313 12:12:24.059186 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23e5b74f_ceb3_4eed_b34c_dac07ec2b3b7.slice/crio-5968daec6ba9bf5c1aed26218d832b3043c965aa4e4209ee43e9c578c382363b WatchSource:0}: Error finding container 5968daec6ba9bf5c1aed26218d832b3043c965aa4e4209ee43e9c578c382363b: Status 404 returned error can't find the container with id 5968daec6ba9bf5c1aed26218d832b3043c965aa4e4209ee43e9c578c382363b Mar 13 12:12:24 crc kubenswrapper[4786]: I0313 12:12:24.720202 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerStarted","Data":"6ca4ed1353fe2122e66b7cdc238326a51066ac9f00f84fe43e52d17f553e850a"} Mar 13 12:12:24 crc 
kubenswrapper[4786]: I0313 12:12:24.725727 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dx9bx" event={"ID":"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7","Type":"ContainerStarted","Data":"9aba29aa33189388ae65aa54d679b66d924f38fcf6996f9e7909443730f648f3"} Mar 13 12:12:24 crc kubenswrapper[4786]: I0313 12:12:24.725787 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dx9bx" event={"ID":"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7","Type":"ContainerStarted","Data":"5968daec6ba9bf5c1aed26218d832b3043c965aa4e4209ee43e9c578c382363b"} Mar 13 12:12:24 crc kubenswrapper[4786]: I0313 12:12:24.730797 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56b9f4ad-c86a-43b2-8165-7a30fe4efc49","Type":"ContainerStarted","Data":"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc"} Mar 13 12:12:24 crc kubenswrapper[4786]: I0313 12:12:24.746373 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dx9bx" podStartSLOduration=2.746351191 podStartE2EDuration="2.746351191s" podCreationTimestamp="2026-03-13 12:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:12:24.739216977 +0000 UTC m=+1532.018870444" watchObservedRunningTime="2026-03-13 12:12:24.746351191 +0000 UTC m=+1532.026004648" Mar 13 12:12:24 crc kubenswrapper[4786]: I0313 12:12:24.770699 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.770675251 podStartE2EDuration="2.770675251s" podCreationTimestamp="2026-03-13 12:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:12:24.762421087 +0000 UTC m=+1532.042074544" watchObservedRunningTime="2026-03-13 
12:12:24.770675251 +0000 UTC m=+1532.050328698" Mar 13 12:12:25 crc kubenswrapper[4786]: I0313 12:12:25.750656 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerStarted","Data":"19ed6f38037a55c43058db0a67693dffe38372d306c408426bb30752659582c5"} Mar 13 12:12:25 crc kubenswrapper[4786]: I0313 12:12:25.751106 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:12:25 crc kubenswrapper[4786]: I0313 12:12:25.784040 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.053108255 podStartE2EDuration="5.784021487s" podCreationTimestamp="2026-03-13 12:12:20 +0000 UTC" firstStartedPulling="2026-03-13 12:12:21.399705976 +0000 UTC m=+1528.679359423" lastFinishedPulling="2026-03-13 12:12:25.130619198 +0000 UTC m=+1532.410272655" observedRunningTime="2026-03-13 12:12:25.78158052 +0000 UTC m=+1533.061233967" watchObservedRunningTime="2026-03-13 12:12:25.784021487 +0000 UTC m=+1533.063674944" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.054304 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.156028 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-kl8l2"] Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.156615 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" podUID="6c25ada4-043b-4351-85c9-87f967f842bb" containerName="dnsmasq-dns" containerID="cri-o://5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a" gracePeriod=10 Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.669366 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.759591 4786 generic.go:334] "Generic (PLEG): container finished" podID="6c25ada4-043b-4351-85c9-87f967f842bb" containerID="5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a" exitCode=0 Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.760555 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.760980 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" event={"ID":"6c25ada4-043b-4351-85c9-87f967f842bb","Type":"ContainerDied","Data":"5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a"} Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.761100 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74b5cfd5-kl8l2" event={"ID":"6c25ada4-043b-4351-85c9-87f967f842bb","Type":"ContainerDied","Data":"a4eadc3e9ca746f43a2ce59feaa21f512713da4d93930dbe6be96cac28306f3d"} Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.761170 4786 scope.go:117] "RemoveContainer" containerID="5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.790380 4786 scope.go:117] "RemoveContainer" containerID="5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.801265 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-config\") pod \"6c25ada4-043b-4351-85c9-87f967f842bb\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.801358 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-svc\") pod \"6c25ada4-043b-4351-85c9-87f967f842bb\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.801403 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-sb\") pod \"6c25ada4-043b-4351-85c9-87f967f842bb\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.801450 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-nb\") pod \"6c25ada4-043b-4351-85c9-87f967f842bb\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.801476 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-swift-storage-0\") pod \"6c25ada4-043b-4351-85c9-87f967f842bb\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.801518 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sww26\" (UniqueName: \"kubernetes.io/projected/6c25ada4-043b-4351-85c9-87f967f842bb-kube-api-access-sww26\") pod \"6c25ada4-043b-4351-85c9-87f967f842bb\" (UID: \"6c25ada4-043b-4351-85c9-87f967f842bb\") " Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.816427 4786 scope.go:117] "RemoveContainer" containerID="5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a" Mar 13 12:12:26 crc kubenswrapper[4786]: E0313 12:12:26.817712 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a\": container with ID starting with 5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a not found: ID does not exist" containerID="5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.817768 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a"} err="failed to get container status \"5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a\": rpc error: code = NotFound desc = could not find container \"5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a\": container with ID starting with 5c6a22b986adb9fa5fa5d95c31545118b673c9bd4c1541c2f21b1e032bf85f3a not found: ID does not exist" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.817800 4786 scope.go:117] "RemoveContainer" containerID="5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b" Mar 13 12:12:26 crc kubenswrapper[4786]: E0313 12:12:26.818159 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b\": container with ID starting with 5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b not found: ID does not exist" containerID="5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.818202 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b"} err="failed to get container status \"5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b\": rpc error: code = NotFound desc = could not find container \"5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b\": container with ID 
starting with 5f15ceb59e8bf55e936b40aea2d2aa3edbb801d854d6df831ba4c4262a39bb9b not found: ID does not exist" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.836521 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c25ada4-043b-4351-85c9-87f967f842bb-kube-api-access-sww26" (OuterVolumeSpecName: "kube-api-access-sww26") pod "6c25ada4-043b-4351-85c9-87f967f842bb" (UID: "6c25ada4-043b-4351-85c9-87f967f842bb"). InnerVolumeSpecName "kube-api-access-sww26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.854964 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-config" (OuterVolumeSpecName: "config") pod "6c25ada4-043b-4351-85c9-87f967f842bb" (UID: "6c25ada4-043b-4351-85c9-87f967f842bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.855946 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c25ada4-043b-4351-85c9-87f967f842bb" (UID: "6c25ada4-043b-4351-85c9-87f967f842bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.866353 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c25ada4-043b-4351-85c9-87f967f842bb" (UID: "6c25ada4-043b-4351-85c9-87f967f842bb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.874405 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c25ada4-043b-4351-85c9-87f967f842bb" (UID: "6c25ada4-043b-4351-85c9-87f967f842bb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.878724 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c25ada4-043b-4351-85c9-87f967f842bb" (UID: "6c25ada4-043b-4351-85c9-87f967f842bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.904721 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.904759 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.904771 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.904782 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" 
Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.904794 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c25ada4-043b-4351-85c9-87f967f842bb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:26 crc kubenswrapper[4786]: I0313 12:12:26.904805 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sww26\" (UniqueName: \"kubernetes.io/projected/6c25ada4-043b-4351-85c9-87f967f842bb-kube-api-access-sww26\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:27 crc kubenswrapper[4786]: I0313 12:12:27.107346 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-kl8l2"] Mar 13 12:12:27 crc kubenswrapper[4786]: I0313 12:12:27.119786 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b74b5cfd5-kl8l2"] Mar 13 12:12:27 crc kubenswrapper[4786]: I0313 12:12:27.451147 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c25ada4-043b-4351-85c9-87f967f842bb" path="/var/lib/kubelet/pods/6c25ada4-043b-4351-85c9-87f967f842bb/volumes" Mar 13 12:12:29 crc kubenswrapper[4786]: I0313 12:12:29.802223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dx9bx" event={"ID":"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7","Type":"ContainerDied","Data":"9aba29aa33189388ae65aa54d679b66d924f38fcf6996f9e7909443730f648f3"} Mar 13 12:12:29 crc kubenswrapper[4786]: I0313 12:12:29.802145 4786 generic.go:334] "Generic (PLEG): container finished" podID="23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" containerID="9aba29aa33189388ae65aa54d679b66d924f38fcf6996f9e7909443730f648f3" exitCode=0 Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.218095 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.394909 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-config-data\") pod \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.394995 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-combined-ca-bundle\") pod \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.395079 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-scripts\") pod \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.395210 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdkdp\" (UniqueName: \"kubernetes.io/projected/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-kube-api-access-pdkdp\") pod \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\" (UID: \"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7\") " Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.400333 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-scripts" (OuterVolumeSpecName: "scripts") pod "23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" (UID: "23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.403699 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-kube-api-access-pdkdp" (OuterVolumeSpecName: "kube-api-access-pdkdp") pod "23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" (UID: "23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7"). InnerVolumeSpecName "kube-api-access-pdkdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.425229 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" (UID: "23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.443126 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-config-data" (OuterVolumeSpecName: "config-data") pod "23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" (UID: "23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.497515 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdkdp\" (UniqueName: \"kubernetes.io/projected/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-kube-api-access-pdkdp\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.497546 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.497556 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.497564 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:31 crc kubenswrapper[4786]: E0313 12:12:31.650468 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23e5b74f_ceb3_4eed_b34c_dac07ec2b3b7.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.825362 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dx9bx" event={"ID":"23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7","Type":"ContainerDied","Data":"5968daec6ba9bf5c1aed26218d832b3043c965aa4e4209ee43e9c578c382363b"} Mar 13 12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.825631 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5968daec6ba9bf5c1aed26218d832b3043c965aa4e4209ee43e9c578c382363b" Mar 13 
12:12:31 crc kubenswrapper[4786]: I0313 12:12:31.825670 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dx9bx" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.008690 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.009040 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b964d018-9a2e-4174-996e-43d8f690752e" containerName="nova-scheduler-scheduler" containerID="cri-o://0171b1bae4f274c3a18b8f31a3de6718c2032d1a6934705bb45ce447b13b2e34" gracePeriod=30 Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.020482 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.021064 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerName="nova-api-log" containerID="cri-o://071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2" gracePeriod=30 Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.021105 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerName="nova-api-api" containerID="cri-o://4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc" gracePeriod=30 Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.030856 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.031127 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-log" 
containerID="cri-o://94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600" gracePeriod=30 Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.031247 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-metadata" containerID="cri-o://9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24" gracePeriod=30 Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.624641 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.818662 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-internal-tls-certs\") pod \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.818974 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-public-tls-certs\") pod \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.819006 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdnt8\" (UniqueName: \"kubernetes.io/projected/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-kube-api-access-hdnt8\") pod \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.819167 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-logs\") pod \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\" 
(UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.819193 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-config-data\") pod \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.819220 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-combined-ca-bundle\") pod \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\" (UID: \"56b9f4ad-c86a-43b2-8165-7a30fe4efc49\") " Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.819502 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-logs" (OuterVolumeSpecName: "logs") pod "56b9f4ad-c86a-43b2-8165-7a30fe4efc49" (UID: "56b9f4ad-c86a-43b2-8165-7a30fe4efc49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.820099 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.831372 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-kube-api-access-hdnt8" (OuterVolumeSpecName: "kube-api-access-hdnt8") pod "56b9f4ad-c86a-43b2-8165-7a30fe4efc49" (UID: "56b9f4ad-c86a-43b2-8165-7a30fe4efc49"). InnerVolumeSpecName "kube-api-access-hdnt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.843546 4786 generic.go:334] "Generic (PLEG): container finished" podID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerID="94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600" exitCode=143 Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.843609 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88a28074-c7a5-4a91-880b-2e0a28bf0de5","Type":"ContainerDied","Data":"94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600"} Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.847242 4786 generic.go:334] "Generic (PLEG): container finished" podID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerID="4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc" exitCode=0 Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.847283 4786 generic.go:334] "Generic (PLEG): container finished" podID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerID="071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2" exitCode=143 Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.847309 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56b9f4ad-c86a-43b2-8165-7a30fe4efc49","Type":"ContainerDied","Data":"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc"} Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.847318 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.847343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56b9f4ad-c86a-43b2-8165-7a30fe4efc49","Type":"ContainerDied","Data":"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2"} Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.847360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56b9f4ad-c86a-43b2-8165-7a30fe4efc49","Type":"ContainerDied","Data":"f5524a092934e54313cf667f2147ffe68d04d9e1f07f5471ae1a7d0b430be192"} Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.847379 4786 scope.go:117] "RemoveContainer" containerID="4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.848789 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-config-data" (OuterVolumeSpecName: "config-data") pod "56b9f4ad-c86a-43b2-8165-7a30fe4efc49" (UID: "56b9f4ad-c86a-43b2-8165-7a30fe4efc49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.867130 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56b9f4ad-c86a-43b2-8165-7a30fe4efc49" (UID: "56b9f4ad-c86a-43b2-8165-7a30fe4efc49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.877471 4786 scope.go:117] "RemoveContainer" containerID="071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.884708 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "56b9f4ad-c86a-43b2-8165-7a30fe4efc49" (UID: "56b9f4ad-c86a-43b2-8165-7a30fe4efc49"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.891686 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "56b9f4ad-c86a-43b2-8165-7a30fe4efc49" (UID: "56b9f4ad-c86a-43b2-8165-7a30fe4efc49"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.899241 4786 scope.go:117] "RemoveContainer" containerID="4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc" Mar 13 12:12:32 crc kubenswrapper[4786]: E0313 12:12:32.899656 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc\": container with ID starting with 4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc not found: ID does not exist" containerID="4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.899757 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc"} err="failed to get container status \"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc\": rpc error: code = NotFound desc = could not find container \"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc\": container with ID starting with 4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc not found: ID does not exist" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.899850 4786 scope.go:117] "RemoveContainer" containerID="071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2" Mar 13 12:12:32 crc kubenswrapper[4786]: E0313 12:12:32.900299 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2\": container with ID starting with 071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2 not found: ID does not exist" containerID="071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.900336 
4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2"} err="failed to get container status \"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2\": rpc error: code = NotFound desc = could not find container \"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2\": container with ID starting with 071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2 not found: ID does not exist" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.900362 4786 scope.go:117] "RemoveContainer" containerID="4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.900976 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc"} err="failed to get container status \"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc\": rpc error: code = NotFound desc = could not find container \"4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc\": container with ID starting with 4159c4c50c1393a2397075f35a990a7b45425e4ab9144ec5a355b53dce5996cc not found: ID does not exist" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.901008 4786 scope.go:117] "RemoveContainer" containerID="071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.901307 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2"} err="failed to get container status \"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2\": rpc error: code = NotFound desc = could not find container \"071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2\": container with ID starting with 
071965aea966ed7f7e8f05ef2037a93b9bcadc5bf7133be4f185075b1f74dde2 not found: ID does not exist" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.921751 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.921787 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.921798 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdnt8\" (UniqueName: \"kubernetes.io/projected/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-kube-api-access-hdnt8\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.921808 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:32 crc kubenswrapper[4786]: I0313 12:12:32.921816 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b9f4ad-c86a-43b2-8165-7a30fe4efc49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.225420 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.256037 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.263942 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.264386 4786 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerName="nova-api-log" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264397 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerName="nova-api-log" Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.264418 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c25ada4-043b-4351-85c9-87f967f842bb" containerName="dnsmasq-dns" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264435 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c25ada4-043b-4351-85c9-87f967f842bb" containerName="dnsmasq-dns" Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.264447 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" containerName="nova-manage" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264453 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" containerName="nova-manage" Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.264461 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c25ada4-043b-4351-85c9-87f967f842bb" containerName="init" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264466 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c25ada4-043b-4351-85c9-87f967f842bb" containerName="init" Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.264477 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerName="nova-api-api" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264483 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerName="nova-api-api" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264653 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c25ada4-043b-4351-85c9-87f967f842bb" 
containerName="dnsmasq-dns" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264662 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerName="nova-api-log" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264683 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" containerName="nova-api-api" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.264696 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" containerName="nova-manage" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.265650 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.268816 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.273332 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.273403 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.280767 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.431270 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4k9\" (UniqueName: \"kubernetes.io/projected/f67ad69d-5191-4d93-9326-b93b0653a82c-kube-api-access-zk4k9\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.431412 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.431479 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67ad69d-5191-4d93-9326-b93b0653a82c-logs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.431740 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-config-data\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.431781 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-public-tls-certs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.431835 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.453224 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56b9f4ad-c86a-43b2-8165-7a30fe4efc49" path="/var/lib/kubelet/pods/56b9f4ad-c86a-43b2-8165-7a30fe4efc49/volumes" Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.507863 4786 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0171b1bae4f274c3a18b8f31a3de6718c2032d1a6934705bb45ce447b13b2e34" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.509856 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0171b1bae4f274c3a18b8f31a3de6718c2032d1a6934705bb45ce447b13b2e34" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.511373 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0171b1bae4f274c3a18b8f31a3de6718c2032d1a6934705bb45ce447b13b2e34" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:12:33 crc kubenswrapper[4786]: E0313 12:12:33.511475 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b964d018-9a2e-4174-996e-43d8f690752e" containerName="nova-scheduler-scheduler" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.533113 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-config-data\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.533352 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-public-tls-certs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.533446 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.533555 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4k9\" (UniqueName: \"kubernetes.io/projected/f67ad69d-5191-4d93-9326-b93b0653a82c-kube-api-access-zk4k9\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.533717 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.533835 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67ad69d-5191-4d93-9326-b93b0653a82c-logs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.534368 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67ad69d-5191-4d93-9326-b93b0653a82c-logs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.539010 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-public-tls-certs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.539214 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.539214 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-config-data\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.540037 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.557620 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4k9\" (UniqueName: \"kubernetes.io/projected/f67ad69d-5191-4d93-9326-b93b0653a82c-kube-api-access-zk4k9\") pod \"nova-api-0\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " pod="openstack/nova-api-0" Mar 13 12:12:33 crc kubenswrapper[4786]: I0313 12:12:33.587924 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:12:34 crc kubenswrapper[4786]: I0313 12:12:34.059022 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:12:34 crc kubenswrapper[4786]: I0313 12:12:34.873291 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67ad69d-5191-4d93-9326-b93b0653a82c","Type":"ContainerStarted","Data":"8d638dd580ac1c508eca3ff370e5d3dd8062fb913eb5b6a7194fb27153cf2701"} Mar 13 12:12:34 crc kubenswrapper[4786]: I0313 12:12:34.873640 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67ad69d-5191-4d93-9326-b93b0653a82c","Type":"ContainerStarted","Data":"bde1fb753ad4d23d69552efc3946e3c4ff275d991136e6e6fc724f42a4350c75"} Mar 13 12:12:34 crc kubenswrapper[4786]: I0313 12:12:34.873658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67ad69d-5191-4d93-9326-b93b0653a82c","Type":"ContainerStarted","Data":"bf5334205b37b6746d50b1f253519d2ca194eba19b9241b1a6d5f55131cafa2d"} Mar 13 12:12:34 crc kubenswrapper[4786]: I0313 12:12:34.906614 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.906592726 podStartE2EDuration="1.906592726s" podCreationTimestamp="2026-03-13 12:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:12:34.89717853 +0000 UTC m=+1542.176831997" watchObservedRunningTime="2026-03-13 12:12:34.906592726 +0000 UTC m=+1542.186246173" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.219043 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:34230->10.217.0.199:8775: read: 
connection reset by peer" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.219092 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:34242->10.217.0.199:8775: read: connection reset by peer" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.658607 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.789204 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-config-data\") pod \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.789272 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwcrp\" (UniqueName: \"kubernetes.io/projected/88a28074-c7a5-4a91-880b-2e0a28bf0de5-kube-api-access-vwcrp\") pod \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.789348 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-nova-metadata-tls-certs\") pod \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.789485 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a28074-c7a5-4a91-880b-2e0a28bf0de5-logs\") pod \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " Mar 
13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.789561 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-combined-ca-bundle\") pod \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\" (UID: \"88a28074-c7a5-4a91-880b-2e0a28bf0de5\") " Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.790146 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a28074-c7a5-4a91-880b-2e0a28bf0de5-logs" (OuterVolumeSpecName: "logs") pod "88a28074-c7a5-4a91-880b-2e0a28bf0de5" (UID: "88a28074-c7a5-4a91-880b-2e0a28bf0de5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.790420 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a28074-c7a5-4a91-880b-2e0a28bf0de5-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.809127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a28074-c7a5-4a91-880b-2e0a28bf0de5-kube-api-access-vwcrp" (OuterVolumeSpecName: "kube-api-access-vwcrp") pod "88a28074-c7a5-4a91-880b-2e0a28bf0de5" (UID: "88a28074-c7a5-4a91-880b-2e0a28bf0de5"). InnerVolumeSpecName "kube-api-access-vwcrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.829806 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-config-data" (OuterVolumeSpecName: "config-data") pod "88a28074-c7a5-4a91-880b-2e0a28bf0de5" (UID: "88a28074-c7a5-4a91-880b-2e0a28bf0de5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.842947 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88a28074-c7a5-4a91-880b-2e0a28bf0de5" (UID: "88a28074-c7a5-4a91-880b-2e0a28bf0de5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.870082 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "88a28074-c7a5-4a91-880b-2e0a28bf0de5" (UID: "88a28074-c7a5-4a91-880b-2e0a28bf0de5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.887521 4786 generic.go:334] "Generic (PLEG): container finished" podID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerID="9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24" exitCode=0 Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.888692 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.897130 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88a28074-c7a5-4a91-880b-2e0a28bf0de5","Type":"ContainerDied","Data":"9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24"} Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.897201 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88a28074-c7a5-4a91-880b-2e0a28bf0de5","Type":"ContainerDied","Data":"a41f81d93f9ee9ba1baff99643bdd6cc58b19825ee684844ccb30999badf5437"} Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.897224 4786 scope.go:117] "RemoveContainer" containerID="9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.898192 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.898226 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.898239 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwcrp\" (UniqueName: \"kubernetes.io/projected/88a28074-c7a5-4a91-880b-2e0a28bf0de5-kube-api-access-vwcrp\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.898254 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a28074-c7a5-4a91-880b-2e0a28bf0de5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.938952 4786 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.947102 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.949038 4786 scope.go:117] "RemoveContainer" containerID="94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.991934 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:12:35 crc kubenswrapper[4786]: E0313 12:12:35.992346 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-log" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.992359 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-log" Mar 13 12:12:35 crc kubenswrapper[4786]: E0313 12:12:35.992392 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-metadata" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.992398 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-metadata" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.992584 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-metadata" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.992598 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" containerName="nova-metadata-log" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.993569 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:12:35 crc kubenswrapper[4786]: I0313 12:12:35.997440 4786 scope.go:117] "RemoveContainer" containerID="9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24" Mar 13 12:12:36 crc kubenswrapper[4786]: E0313 12:12:36.003316 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24\": container with ID starting with 9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24 not found: ID does not exist" containerID="9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.003358 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24"} err="failed to get container status \"9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24\": rpc error: code = NotFound desc = could not find container \"9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24\": container with ID starting with 9b981d97a9f13e1c60652a7d0603a7b903ccaa605ba308c4d88f04ce7b7c6d24 not found: ID does not exist" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.003383 4786 scope.go:117] "RemoveContainer" containerID="94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.003750 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.003954 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.005464 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-config-data\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.005498 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-logs\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.005528 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6t64\" (UniqueName: \"kubernetes.io/projected/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-kube-api-access-f6t64\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.005603 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.005621 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: E0313 12:12:36.017319 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600\": 
container with ID starting with 94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600 not found: ID does not exist" containerID="94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.017364 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600"} err="failed to get container status \"94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600\": rpc error: code = NotFound desc = could not find container \"94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600\": container with ID starting with 94c4b1805a8667cf0a9eb3f89b7fd00e5e00189c1fd32dc6abb1a0cc7d79f600 not found: ID does not exist" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.048947 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.106851 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-config-data\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.107137 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-logs\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.107165 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6t64\" (UniqueName: \"kubernetes.io/projected/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-kube-api-access-f6t64\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " 
pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.107219 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.107238 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.107683 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-logs\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.110674 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.110744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.111396 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-config-data\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.123778 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6t64\" (UniqueName: \"kubernetes.io/projected/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-kube-api-access-f6t64\") pod \"nova-metadata-0\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.343867 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.886831 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:12:36 crc kubenswrapper[4786]: W0313 12:12:36.887801 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a06f1e9_ddda_42a5_ab33_88473c56a6c7.slice/crio-e6184906d6200c8db201c66032f8cbd84a03cf95163401dd92418357eaba8f81 WatchSource:0}: Error finding container e6184906d6200c8db201c66032f8cbd84a03cf95163401dd92418357eaba8f81: Status 404 returned error can't find the container with id e6184906d6200c8db201c66032f8cbd84a03cf95163401dd92418357eaba8f81 Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.900690 4786 generic.go:334] "Generic (PLEG): container finished" podID="b964d018-9a2e-4174-996e-43d8f690752e" containerID="0171b1bae4f274c3a18b8f31a3de6718c2032d1a6934705bb45ce447b13b2e34" exitCode=0 Mar 13 12:12:36 crc kubenswrapper[4786]: I0313 12:12:36.900763 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b964d018-9a2e-4174-996e-43d8f690752e","Type":"ContainerDied","Data":"0171b1bae4f274c3a18b8f31a3de6718c2032d1a6934705bb45ce447b13b2e34"} Mar 13 12:12:37 
crc kubenswrapper[4786]: I0313 12:12:37.166451 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.332180 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2m8s\" (UniqueName: \"kubernetes.io/projected/b964d018-9a2e-4174-996e-43d8f690752e-kube-api-access-m2m8s\") pod \"b964d018-9a2e-4174-996e-43d8f690752e\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.332643 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-combined-ca-bundle\") pod \"b964d018-9a2e-4174-996e-43d8f690752e\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.332990 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-config-data\") pod \"b964d018-9a2e-4174-996e-43d8f690752e\" (UID: \"b964d018-9a2e-4174-996e-43d8f690752e\") " Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.343075 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b964d018-9a2e-4174-996e-43d8f690752e-kube-api-access-m2m8s" (OuterVolumeSpecName: "kube-api-access-m2m8s") pod "b964d018-9a2e-4174-996e-43d8f690752e" (UID: "b964d018-9a2e-4174-996e-43d8f690752e"). InnerVolumeSpecName "kube-api-access-m2m8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.364675 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-config-data" (OuterVolumeSpecName: "config-data") pod "b964d018-9a2e-4174-996e-43d8f690752e" (UID: "b964d018-9a2e-4174-996e-43d8f690752e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.364715 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b964d018-9a2e-4174-996e-43d8f690752e" (UID: "b964d018-9a2e-4174-996e-43d8f690752e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.435239 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.435306 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b964d018-9a2e-4174-996e-43d8f690752e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.435331 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2m8s\" (UniqueName: \"kubernetes.io/projected/b964d018-9a2e-4174-996e-43d8f690752e-kube-api-access-m2m8s\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.453065 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a28074-c7a5-4a91-880b-2e0a28bf0de5" path="/var/lib/kubelet/pods/88a28074-c7a5-4a91-880b-2e0a28bf0de5/volumes" Mar 
13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.914597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1a06f1e9-ddda-42a5-ab33-88473c56a6c7","Type":"ContainerStarted","Data":"0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77"} Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.914665 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1a06f1e9-ddda-42a5-ab33-88473c56a6c7","Type":"ContainerStarted","Data":"8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2"} Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.914684 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1a06f1e9-ddda-42a5-ab33-88473c56a6c7","Type":"ContainerStarted","Data":"e6184906d6200c8db201c66032f8cbd84a03cf95163401dd92418357eaba8f81"} Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.917261 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b964d018-9a2e-4174-996e-43d8f690752e","Type":"ContainerDied","Data":"a4c44926c573169c92a0635e0bc2f5df32c2fbe6bb27d7fe2aa762a0ea94cacf"} Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.917322 4786 scope.go:117] "RemoveContainer" containerID="0171b1bae4f274c3a18b8f31a3de6718c2032d1a6934705bb45ce447b13b2e34" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.917354 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.948603 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.948570255 podStartE2EDuration="2.948570255s" podCreationTimestamp="2026-03-13 12:12:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:12:37.94244977 +0000 UTC m=+1545.222103227" watchObservedRunningTime="2026-03-13 12:12:37.948570255 +0000 UTC m=+1545.228223692" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.965177 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.976671 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.993302 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:12:37 crc kubenswrapper[4786]: E0313 12:12:37.994096 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b964d018-9a2e-4174-996e-43d8f690752e" containerName="nova-scheduler-scheduler" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.994143 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b964d018-9a2e-4174-996e-43d8f690752e" containerName="nova-scheduler-scheduler" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.994510 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b964d018-9a2e-4174-996e-43d8f690752e" containerName="nova-scheduler-scheduler" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.995556 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:12:37 crc kubenswrapper[4786]: I0313 12:12:37.998107 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.002538 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.148528 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn9jp\" (UniqueName: \"kubernetes.io/projected/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-kube-api-access-tn9jp\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.148928 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.148993 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-config-data\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.249817 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.249940 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-config-data\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.250062 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn9jp\" (UniqueName: \"kubernetes.io/projected/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-kube-api-access-tn9jp\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.255437 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.266791 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-config-data\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.269478 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn9jp\" (UniqueName: \"kubernetes.io/projected/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-kube-api-access-tn9jp\") pod \"nova-scheduler-0\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.324541 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.818569 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:12:38 crc kubenswrapper[4786]: I0313 12:12:38.947447 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659","Type":"ContainerStarted","Data":"5aa3f69b0300a396468330831e52f26e3a75fe7a71548e3cfc302e42bc91bfcc"} Mar 13 12:12:39 crc kubenswrapper[4786]: I0313 12:12:39.459824 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b964d018-9a2e-4174-996e-43d8f690752e" path="/var/lib/kubelet/pods/b964d018-9a2e-4174-996e-43d8f690752e/volumes" Mar 13 12:12:39 crc kubenswrapper[4786]: I0313 12:12:39.960761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659","Type":"ContainerStarted","Data":"baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86"} Mar 13 12:12:39 crc kubenswrapper[4786]: I0313 12:12:39.996843 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.996815572 podStartE2EDuration="2.996815572s" podCreationTimestamp="2026-03-13 12:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:12:39.983535232 +0000 UTC m=+1547.263188719" watchObservedRunningTime="2026-03-13 12:12:39.996815572 +0000 UTC m=+1547.276469059" Mar 13 12:12:41 crc kubenswrapper[4786]: I0313 12:12:41.345166 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:12:41 crc kubenswrapper[4786]: I0313 12:12:41.346470 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:12:43 crc kubenswrapper[4786]: I0313 
12:12:43.325028 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 12:12:43 crc kubenswrapper[4786]: I0313 12:12:43.588180 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:12:43 crc kubenswrapper[4786]: I0313 12:12:43.588256 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:12:44 crc kubenswrapper[4786]: I0313 12:12:44.603060 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:12:44 crc kubenswrapper[4786]: I0313 12:12:44.603112 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:12:46 crc kubenswrapper[4786]: I0313 12:12:46.345906 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:12:46 crc kubenswrapper[4786]: I0313 12:12:46.346215 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:12:47 crc kubenswrapper[4786]: I0313 12:12:47.372077 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:12:47 crc kubenswrapper[4786]: I0313 12:12:47.372077 4786 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:12:48 crc kubenswrapper[4786]: I0313 12:12:48.325577 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 12:12:48 crc kubenswrapper[4786]: I0313 12:12:48.354381 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 12:12:49 crc kubenswrapper[4786]: I0313 12:12:49.100351 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 12:12:51 crc kubenswrapper[4786]: I0313 12:12:51.105872 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 12:12:53 crc kubenswrapper[4786]: I0313 12:12:53.599334 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:12:53 crc kubenswrapper[4786]: I0313 12:12:53.601052 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:12:53 crc kubenswrapper[4786]: I0313 12:12:53.601773 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:12:53 crc kubenswrapper[4786]: I0313 12:12:53.606854 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:12:54 crc kubenswrapper[4786]: I0313 12:12:54.100242 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:12:54 crc kubenswrapper[4786]: I0313 12:12:54.105990 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:12:56 crc kubenswrapper[4786]: I0313 12:12:56.349778 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:12:56 crc kubenswrapper[4786]: I0313 12:12:56.359946 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:12:56 crc kubenswrapper[4786]: I0313 12:12:56.360845 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:12:57 crc kubenswrapper[4786]: I0313 12:12:57.133740 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:13:14 crc kubenswrapper[4786]: I0313 12:13:14.973241 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 12:13:14 crc kubenswrapper[4786]: I0313 12:13:14.974043 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c9cf93cd-d636-4947-8318-0fade89f65d7" containerName="openstackclient" containerID="cri-o://2196dbffe1af6cbceffdff832f9438466b94634b242cc388d431fdf75e8ded98" gracePeriod=2 Mar 13 12:13:14 crc kubenswrapper[4786]: I0313 12:13:14.992397 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.151204 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gpn45"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.181147 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gpn45"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.241063 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.317594 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3078-account-create-update-n6pkt"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.345277 4786 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3078-account-create-update-n6pkt"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.490494 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6328ef-79d0-4db2-a172-1e2bbd1f8923" path="/var/lib/kubelet/pods/6e6328ef-79d0-4db2-a172-1e2bbd1f8923/volumes" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.491353 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d04659-fd3d-4668-9ad5-51824cb9760a" path="/var/lib/kubelet/pods/b7d04659-fd3d-4668-9ad5-51824cb9760a/volumes" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.492030 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-da6e-account-create-update-56fnk"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.527803 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-da6e-account-create-update-56fnk"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.610053 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.633335 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hncnh"] Mar 13 12:13:15 crc kubenswrapper[4786]: E0313 12:13:15.633800 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cf93cd-d636-4947-8318-0fade89f65d7" containerName="openstackclient" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.633824 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cf93cd-d636-4947-8318-0fade89f65d7" containerName="openstackclient" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.634056 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cf93cd-d636-4947-8318-0fade89f65d7" containerName="openstackclient" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.634626 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.644364 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.672616 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hncnh"] Mar 13 12:13:15 crc kubenswrapper[4786]: E0313 12:13:15.688948 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 13 12:13:15 crc kubenswrapper[4786]: E0313 12:13:15.689027 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data podName:53fea24b-7ca8-4c0a-96d1-458ca1e877a7 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:16.188998919 +0000 UTC m=+1583.468652366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data") pod "rabbitmq-server-0" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7") : configmap "rabbitmq-config-data" not found Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.697938 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.698652 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerName="openstack-network-exporter" containerID="cri-o://6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b" gracePeriod=300 Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.728500 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3078-account-create-update-mxjs7"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 
12:13:15.729873 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.736084 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.744972 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3078-account-create-update-mxjs7"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.790701 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7k2\" (UniqueName: \"kubernetes.io/projected/7effb60f-4f63-48d0-8b3e-1792e39c79d5-kube-api-access-9q7k2\") pod \"root-account-create-update-hncnh\" (UID: \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\") " pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.790780 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7effb60f-4f63-48d0-8b3e-1792e39c79d5-operator-scripts\") pod \"root-account-create-update-hncnh\" (UID: \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\") " pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.790801 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530d22ba-9371-4850-8c78-26323a26ad06-operator-scripts\") pod \"barbican-3078-account-create-update-mxjs7\" (UID: \"530d22ba-9371-4850-8c78-26323a26ad06\") " pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.790842 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqs7z\" (UniqueName: 
\"kubernetes.io/projected/530d22ba-9371-4850-8c78-26323a26ad06-kube-api-access-pqs7z\") pod \"barbican-3078-account-create-update-mxjs7\" (UID: \"530d22ba-9371-4850-8c78-26323a26ad06\") " pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.823268 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.823852 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="openstack-network-exporter" containerID="cri-o://75127e67378a2a6d7c4e145c4a096adfcfcfcab00c002649d437ac56addedfda" gracePeriod=300 Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.858319 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-x5ns6"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.895948 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-x5ns6"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.897051 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqs7z\" (UniqueName: \"kubernetes.io/projected/530d22ba-9371-4850-8c78-26323a26ad06-kube-api-access-pqs7z\") pod \"barbican-3078-account-create-update-mxjs7\" (UID: \"530d22ba-9371-4850-8c78-26323a26ad06\") " pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.897185 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7k2\" (UniqueName: \"kubernetes.io/projected/7effb60f-4f63-48d0-8b3e-1792e39c79d5-kube-api-access-9q7k2\") pod \"root-account-create-update-hncnh\" (UID: \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\") " pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.897440 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7effb60f-4f63-48d0-8b3e-1792e39c79d5-operator-scripts\") pod \"root-account-create-update-hncnh\" (UID: \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\") " pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.897466 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530d22ba-9371-4850-8c78-26323a26ad06-operator-scripts\") pod \"barbican-3078-account-create-update-mxjs7\" (UID: \"530d22ba-9371-4850-8c78-26323a26ad06\") " pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.898241 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530d22ba-9371-4850-8c78-26323a26ad06-operator-scripts\") pod \"barbican-3078-account-create-update-mxjs7\" (UID: \"530d22ba-9371-4850-8c78-26323a26ad06\") " pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.899016 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7effb60f-4f63-48d0-8b3e-1792e39c79d5-operator-scripts\") pod \"root-account-create-update-hncnh\" (UID: \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\") " pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.907983 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-666xn"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.943954 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-mpvpt"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.944199 4786 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ovn-controller-metrics-mpvpt" podUID="5e05d002-d224-4a13-8497-fc49712f7084" containerName="openstack-network-exporter" containerID="cri-o://e7effd0a662e6015905b4c8a787a7e2eab9bffc913dcf3ece7c373f8bf960414" gracePeriod=30 Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.963975 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tpch6"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.976518 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqs7z\" (UniqueName: \"kubernetes.io/projected/530d22ba-9371-4850-8c78-26323a26ad06-kube-api-access-pqs7z\") pod \"barbican-3078-account-create-update-mxjs7\" (UID: \"530d22ba-9371-4850-8c78-26323a26ad06\") " pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.978427 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7k2\" (UniqueName: \"kubernetes.io/projected/7effb60f-4f63-48d0-8b3e-1792e39c79d5-kube-api-access-9q7k2\") pod \"root-account-create-update-hncnh\" (UID: \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\") " pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.987773 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.996556 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.996852 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="ovn-northd" containerID="cri-o://73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244" gracePeriod=30 Mar 13 12:13:15 crc kubenswrapper[4786]: I0313 12:13:15.996985 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="openstack-network-exporter" containerID="cri-o://9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.007176 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-26dmp"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.007465 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69ffc749-26dmp" podUID="8851cf9e-656d-439d-a0d8-a16bdc843d87" containerName="dnsmasq-dns" containerID="cri-o://894cb93b5b88c9cda5163ee38d365a1d2b6f2902380b8a636a78fe6a94bbf669" gracePeriod=10 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.035941 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-prqp8"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.047314 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-prqp8"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.058087 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.125231 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="ovsdbserver-nb" containerID="cri-o://9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01" gracePeriod=300 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.197638 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5197-account-create-update-6g4d9"] Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.202542 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.202626 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data podName:53fea24b-7ca8-4c0a-96d1-458ca1e877a7 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:17.202607175 +0000 UTC m=+1584.482260622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data") pod "rabbitmq-server-0" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7") : configmap "rabbitmq-config-data" not found Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.223235 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerName="ovsdbserver-sb" containerID="cri-o://cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f" gracePeriod=300 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.226458 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5197-account-create-update-6g4d9"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.270235 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-fef1-account-create-update-qx49r"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.289370 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-fef1-account-create-update-qx49r"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.295503 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-59gr5"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.321685 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-59gr5"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.331517 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5rb8r"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.345152 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-88c9bb894-zzsvv"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.345486 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-88c9bb894-zzsvv" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" 
containerName="placement-log" containerID="cri-o://f34dd912eb47d002fd56518d38540c1994f7c17513d2933e712f79bc0fca64c8" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.345978 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-88c9bb894-zzsvv" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerName="placement-api" containerID="cri-o://52dda96d5af850effe132c05ad903de42b92e2b4d4cba2475c20cf70be8ffde6" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.362353 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5rb8r"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.398983 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8shl7"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.443114 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8shl7"] Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.449704 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01 is running failed: container process not found" containerID="9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.451461 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3a37be46-7b90-4c56-8dcf-a3ea45123df8/ovsdbserver-nb/0.log" Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.451500 4786 generic.go:334] "Generic (PLEG): container finished" podID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerID="75127e67378a2a6d7c4e145c4a096adfcfcfcab00c002649d437ac56addedfda" exitCode=2 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.451514 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerID="9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01" exitCode=143 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.451552 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a37be46-7b90-4c56-8dcf-a3ea45123df8","Type":"ContainerDied","Data":"75127e67378a2a6d7c4e145c4a096adfcfcfcab00c002649d437ac56addedfda"} Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.451577 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a37be46-7b90-4c56-8dcf-a3ea45123df8","Type":"ContainerDied","Data":"9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01"} Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.455602 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01 is running failed: container process not found" containerID="9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.458224 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01 is running failed: container process not found" containerID="9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.458340 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" 
podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="ovsdbserver-nb" Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.466450 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerID="9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc" exitCode=2 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.466549 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b25a4cb-7b76-4863-9085-67f99d81f569","Type":"ContainerDied","Data":"9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc"} Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.482731 4786 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/cinder-scheduler-0" secret="" err="secret \"cinder-cinder-dockercfg-8pp56\" not found" Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.492056 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ac9a-account-create-update-rbc5m"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.501345 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ac9a-account-create-update-rbc5m"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.501941 4786 generic.go:334] "Generic (PLEG): container finished" podID="8851cf9e-656d-439d-a0d8-a16bdc843d87" containerID="894cb93b5b88c9cda5163ee38d365a1d2b6f2902380b8a636a78fe6a94bbf669" exitCode=0 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.502012 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-26dmp" event={"ID":"8851cf9e-656d-439d-a0d8-a16bdc843d87","Type":"ContainerDied","Data":"894cb93b5b88c9cda5163ee38d365a1d2b6f2902380b8a636a78fe6a94bbf669"} Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.535680 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_7e0cbbc8-b706-4a93-bd1b-442a68cce24b/ovsdbserver-sb/0.log" Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.535722 4786 generic.go:334] "Generic (PLEG): container finished" podID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerID="6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b" exitCode=2 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.535759 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e0cbbc8-b706-4a93-bd1b-442a68cce24b","Type":"ContainerDied","Data":"6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b"} Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.591946 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-lrq7b"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.609151 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-lrq7b"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.626797 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.627331 4786 secret.go:188] Couldn't get secret openstack/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" not found Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.627357 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-server" containerID="cri-o://8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.627391 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. 
No retries permitted until 2026-03-13 12:13:17.12737486 +0000 UTC m=+1584.407028297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scheduler-config-data" not found Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.627745 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="swift-recon-cron" containerID="cri-o://d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.627793 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="rsync" containerID="cri-o://1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.627827 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-expirer" containerID="cri-o://3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.627937 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-updater" containerID="cri-o://1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.627997 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" 
containerName="object-auditor" containerID="cri-o://80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628029 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-replicator" containerID="cri-o://3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628060 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-server" containerID="cri-o://e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628094 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-updater" containerID="cri-o://a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628137 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-auditor" containerID="cri-o://18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628182 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-replicator" containerID="cri-o://cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628214 4786 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-server" containerID="cri-o://7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628242 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-reaper" containerID="cri-o://bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628269 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-auditor" containerID="cri-o://82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.628301 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-replicator" containerID="cri-o://090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.628421 4786 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.628584 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:17.128546422 +0000 UTC m=+1584.408199869 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-config-data" not found Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.628609 4786 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Mar 13 12:13:16 crc kubenswrapper[4786]: E0313 12:13:16.628649 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:17.128635224 +0000 UTC m=+1584.408288661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scripts" not found Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.637006 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.807592 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerName="rabbitmq" containerID="cri-o://6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943" gracePeriod=604800 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.850195 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vrgnd"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.862849 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vrgnd"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.870154 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-ee17-account-create-update-5l6xl"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.891403 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ee17-account-create-update-5l6xl"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.905867 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-91d2-account-create-update-kmjpl"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.928248 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-91d2-account-create-update-kmjpl"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.943032 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dx9bx"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.956150 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dx9bx"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.988338 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6577bdf497-p2bmr"] Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.988599 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6577bdf497-p2bmr" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerName="neutron-api" containerID="cri-o://1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555" gracePeriod=30 Mar 13 12:13:16 crc kubenswrapper[4786]: I0313 12:13:16.989010 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6577bdf497-p2bmr" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerName="neutron-httpd" containerID="cri-o://617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.005792 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:13:17 crc kubenswrapper[4786]: 
I0313 12:13:17.048049 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rj4n5"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.048098 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rj4n5"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.075131 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.075342 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api-log" containerID="cri-o://9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.075680 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api" containerID="cri-o://be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.113408 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.151341 4786 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.151409 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:18.151393758 +0000 UTC m=+1585.431047205 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scripts" not found Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.151450 4786 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.151469 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:18.15146327 +0000 UTC m=+1585.431116717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-config-data" not found Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.151501 4786 secret.go:188] Couldn't get secret openstack/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" not found Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.151517 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:18.151511681 +0000 UTC m=+1585.431165128 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scheduler-config-data" not found Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.158738 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-tjh4s"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.169013 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.169280 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerName="glance-log" containerID="cri-o://bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.169533 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerName="glance-httpd" containerID="cri-o://00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.185526 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-tjh4s"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.202737 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d64c-account-create-update-zbfj6"] Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.253551 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.253723 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data podName:53fea24b-7ca8-4c0a-96d1-458ca1e877a7 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:19.253704644 +0000 UTC m=+1586.533358091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data") pod "rabbitmq-server-0" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7") : configmap "rabbitmq-config-data" not found Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.281125 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d64c-account-create-update-zbfj6"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.306679 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerName="rabbitmq" containerID="cri-o://8546d15d615043030d104f666fcccae710b91eaabc4b545097a038170b3a7dcf" gracePeriod=604800 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.324979 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.325498 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-log" containerID="cri-o://8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.326034 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-metadata" containerID="cri-o://0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.343287 4786 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.353104 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.353411 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerName="glance-log" containerID="cri-o://b9aec14b391a1bbbd8f466a3df6625873e9ec6de58fe63728da3a16855652999" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.354649 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerName="glance-httpd" containerID="cri-o://2a0a684c9b4c3a0e217496dbf85c36bb9e3e0ef4e9a768baeea5f590927cf39d" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.370062 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-467fr"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.383179 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-467fr"] Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.400408 4786 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 13 12:13:17 crc kubenswrapper[4786]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 13 12:13:17 crc kubenswrapper[4786]: + source /usr/local/bin/container-scripts/functions Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNBridge=br-int Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNRemote=tcp:localhost:6642 Mar 13 12:13:17 
crc kubenswrapper[4786]: ++ OVNEncapType=geneve Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNAvailabilityZones= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ EnableChassisAsGateway=true Mar 13 12:13:17 crc kubenswrapper[4786]: ++ PhysicalNetworks= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNHostName= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 13 12:13:17 crc kubenswrapper[4786]: ++ ovs_dir=/var/lib/openvswitch Mar 13 12:13:17 crc kubenswrapper[4786]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 13 12:13:17 crc kubenswrapper[4786]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 13 12:13:17 crc kubenswrapper[4786]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + sleep 0.5 Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + sleep 0.5 Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + cleanup_ovsdb_server_semaphore Mar 13 12:13:17 crc kubenswrapper[4786]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 13 12:13:17 crc kubenswrapper[4786]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 13 12:13:17 crc kubenswrapper[4786]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-tpch6" message=< Mar 13 12:13:17 crc kubenswrapper[4786]: Exiting ovsdb-server (5) [ OK ] Mar 13 12:13:17 crc kubenswrapper[4786]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 13 12:13:17 crc kubenswrapper[4786]: + source /usr/local/bin/container-scripts/functions Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNBridge=br-int Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNRemote=tcp:localhost:6642 Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNEncapType=geneve Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNAvailabilityZones= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ EnableChassisAsGateway=true Mar 13 12:13:17 crc kubenswrapper[4786]: ++ PhysicalNetworks= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNHostName= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 13 12:13:17 crc kubenswrapper[4786]: ++ ovs_dir=/var/lib/openvswitch Mar 13 12:13:17 crc kubenswrapper[4786]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 13 12:13:17 crc kubenswrapper[4786]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 13 12:13:17 crc kubenswrapper[4786]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + sleep 0.5 Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + sleep 0.5 Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + cleanup_ovsdb_server_semaphore Mar 13 12:13:17 crc kubenswrapper[4786]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 13 12:13:17 crc kubenswrapper[4786]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 13 12:13:17 crc kubenswrapper[4786]: > Mar 13 12:13:17 crc kubenswrapper[4786]: E0313 12:13:17.400454 4786 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 13 12:13:17 crc kubenswrapper[4786]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 13 12:13:17 crc kubenswrapper[4786]: + source /usr/local/bin/container-scripts/functions Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNBridge=br-int Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNRemote=tcp:localhost:6642 Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNEncapType=geneve Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNAvailabilityZones= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ EnableChassisAsGateway=true Mar 13 12:13:17 crc kubenswrapper[4786]: ++ PhysicalNetworks= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ OVNHostName= Mar 13 12:13:17 crc kubenswrapper[4786]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 13 12:13:17 crc kubenswrapper[4786]: ++ ovs_dir=/var/lib/openvswitch Mar 13 12:13:17 crc kubenswrapper[4786]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 13 12:13:17 crc kubenswrapper[4786]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 13 12:13:17 crc kubenswrapper[4786]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + sleep 0.5 Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + sleep 0.5 Mar 13 12:13:17 crc kubenswrapper[4786]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 13 12:13:17 crc kubenswrapper[4786]: + cleanup_ovsdb_server_semaphore Mar 13 12:13:17 crc kubenswrapper[4786]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 13 12:13:17 crc kubenswrapper[4786]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 13 12:13:17 crc kubenswrapper[4786]: > pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" containerID="cri-o://7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.400489 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" containerID="cri-o://7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" gracePeriod=29 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.419140 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.426351 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rhxlw"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.482597 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07961f7a-7824-4e7d-b30a-e47699b2ca0f" path="/var/lib/kubelet/pods/07961f7a-7824-4e7d-b30a-e47699b2ca0f/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.483124 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb" path="/var/lib/kubelet/pods/108a37cf-a5a0-4ffd-b609-bb0bd4d28bfb/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.483593 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7" path="/var/lib/kubelet/pods/23e5b74f-ceb3-4eed-b34c-dac07ec2b3b7/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.488865 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af72684-e70e-4ff4-a72c-d4e830667645" path="/var/lib/kubelet/pods/3af72684-e70e-4ff4-a72c-d4e830667645/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.489450 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca1e192-7715-4bd6-b4b3-d6e6912b319c" path="/var/lib/kubelet/pods/3ca1e192-7715-4bd6-b4b3-d6e6912b319c/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.490021 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ce55b0-1082-49c8-b648-92425775ed24" path="/var/lib/kubelet/pods/42ce55b0-1082-49c8-b648-92425775ed24/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.491851 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd3434b-b358-4463-a081-511dd7a3469d" path="/var/lib/kubelet/pods/4fd3434b-b358-4463-a081-511dd7a3469d/volumes" Mar 13 12:13:17 crc 
kubenswrapper[4786]: I0313 12:13:17.492411 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6990d3ed-4503-4d9c-9f56-7b21a9abb203" path="/var/lib/kubelet/pods/6990d3ed-4503-4d9c-9f56-7b21a9abb203/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.492923 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0fe660-4646-4b25-b5b6-b24989d78be4" path="/var/lib/kubelet/pods/6d0fe660-4646-4b25-b5b6-b24989d78be4/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.534965 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9edc8ac7-41c7-4051-aca8-9fc79e516a2b" path="/var/lib/kubelet/pods/9edc8ac7-41c7-4051-aca8-9fc79e516a2b/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.535804 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a594fa40-6352-480d-8927-c04bf51c9c51" path="/var/lib/kubelet/pods/a594fa40-6352-480d-8927-c04bf51c9c51/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.536511 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c2078a-f957-4d60-9a47-f7b0c7248b75" path="/var/lib/kubelet/pods/a5c2078a-f957-4d60-9a47-f7b0c7248b75/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.560963 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-nb\") pod \"8851cf9e-656d-439d-a0d8-a16bdc843d87\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.561063 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t27kc\" (UniqueName: \"kubernetes.io/projected/8851cf9e-656d-439d-a0d8-a16bdc843d87-kube-api-access-t27kc\") pod \"8851cf9e-656d-439d-a0d8-a16bdc843d87\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 
12:13:17.561194 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-svc\") pod \"8851cf9e-656d-439d-a0d8-a16bdc843d87\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.561267 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-config\") pod \"8851cf9e-656d-439d-a0d8-a16bdc843d87\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.561293 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-swift-storage-0\") pod \"8851cf9e-656d-439d-a0d8-a16bdc843d87\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.561342 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-sb\") pod \"8851cf9e-656d-439d-a0d8-a16bdc843d87\" (UID: \"8851cf9e-656d-439d-a0d8-a16bdc843d87\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.565485 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d192abbc-1942-4e41-8e85-4416d725ac32" path="/var/lib/kubelet/pods/d192abbc-1942-4e41-8e85-4416d725ac32/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.566300 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dd74a7-14cd-4d77-95a3-0d8c98edb870" path="/var/lib/kubelet/pods/d5dd74a7-14cd-4d77-95a3-0d8c98edb870/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.574617 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d69f3ce2-7166-46f3-8381-987837e3383e" path="/var/lib/kubelet/pods/d69f3ce2-7166-46f3-8381-987837e3383e/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.596046 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e0cbbc8-b706-4a93-bd1b-442a68cce24b/ovsdbserver-sb/0.log" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.596117 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.597229 4786 generic.go:334] "Generic (PLEG): container finished" podID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerID="bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3" exitCode=143 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.610628 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef8e93b-aef1-4d5b-a40d-eaad723384cf" path="/var/lib/kubelet/pods/eef8e93b-aef1-4d5b-a40d-eaad723384cf/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.611733 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f70490bf-3f7e-4490-b045-dd095a1fdd16" path="/var/lib/kubelet/pods/f70490bf-3f7e-4490-b045-dd095a1fdd16/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.612339 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa010b2d-a4cf-4646-b289-54e0a6e285dd" path="/var/lib/kubelet/pods/fa010b2d-a4cf-4646-b289-54e0a6e285dd/volumes" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.614430 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8851cf9e-656d-439d-a0d8-a16bdc843d87-kube-api-access-t27kc" (OuterVolumeSpecName: "kube-api-access-t27kc") pod "8851cf9e-656d-439d-a0d8-a16bdc843d87" (UID: "8851cf9e-656d-439d-a0d8-a16bdc843d87"). InnerVolumeSpecName "kube-api-access-t27kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.635699 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mpvpt_5e05d002-d224-4a13-8497-fc49712f7084/openstack-network-exporter/0.log" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.635742 4786 generic.go:334] "Generic (PLEG): container finished" podID="5e05d002-d224-4a13-8497-fc49712f7084" containerID="e7effd0a662e6015905b4c8a787a7e2eab9bffc913dcf3ece7c373f8bf960414" exitCode=2 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.640240 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e9745df-949d-443d-93bb-0e5b3692ccd6","Type":"ContainerDied","Data":"bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.640277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mpvpt" event={"ID":"5e05d002-d224-4a13-8497-fc49712f7084","Type":"ContainerDied","Data":"e7effd0a662e6015905b4c8a787a7e2eab9bffc913dcf3ece7c373f8bf960414"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.640291 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rhxlw"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.640309 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.640321 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-dbb48765-fzcqd"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.640667 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-dbb48765-fzcqd" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-httpd" containerID="cri-o://bd18ddf3196c9bf5a4a9b83508e122c58fe4f289252ce5dccbca45f28bb401b8" 
gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.640862 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-dbb48765-fzcqd" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-server" containerID="cri-o://6faefcc8f8f08a959c2efe031b6171c7238793dbc61001559ea27795b9e169c2" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.667634 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdbserver-sb-tls-certs\") pod \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.667914 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-combined-ca-bundle\") pod \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.667975 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj7w6\" (UniqueName: \"kubernetes.io/projected/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-kube-api-access-dj7w6\") pod \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.668014 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-scripts\") pod \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.668050 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.668071 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-metrics-certs-tls-certs\") pod \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.668115 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-config\") pod \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.668170 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdb-rundir\") pod \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\" (UID: \"7e0cbbc8-b706-4a93-bd1b-442a68cce24b\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.668253 4786 generic.go:334] "Generic (PLEG): container finished" podID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerID="9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410" exitCode=143 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.668335 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aa75843b-0c7d-49c1-be09-bef85ec8fd16","Type":"ContainerDied","Data":"9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.668591 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t27kc\" (UniqueName: 
\"kubernetes.io/projected/8851cf9e-656d-439d-a0d8-a16bdc843d87-kube-api-access-t27kc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.674544 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-config" (OuterVolumeSpecName: "config") pod "7e0cbbc8-b706-4a93-bd1b-442a68cce24b" (UID: "7e0cbbc8-b706-4a93-bd1b-442a68cce24b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.674680 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-scripts" (OuterVolumeSpecName: "scripts") pod "7e0cbbc8-b706-4a93-bd1b-442a68cce24b" (UID: "7e0cbbc8-b706-4a93-bd1b-442a68cce24b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.684373 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerID="f34dd912eb47d002fd56518d38540c1994f7c17513d2933e712f79bc0fca64c8" exitCode=143 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.684652 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88c9bb894-zzsvv" event={"ID":"b0d491ad-ee68-47bb-a1e3-66d22ecca41a","Type":"ContainerDied","Data":"f34dd912eb47d002fd56518d38540c1994f7c17513d2933e712f79bc0fca64c8"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.686079 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "7e0cbbc8-b706-4a93-bd1b-442a68cce24b" (UID: "7e0cbbc8-b706-4a93-bd1b-442a68cce24b"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.687494 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "7e0cbbc8-b706-4a93-bd1b-442a68cce24b" (UID: "7e0cbbc8-b706-4a93-bd1b-442a68cce24b"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.687557 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-lvhh7"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.698722 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-lvhh7"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.708319 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-config" (OuterVolumeSpecName: "config") pod "8851cf9e-656d-439d-a0d8-a16bdc843d87" (UID: "8851cf9e-656d-439d-a0d8-a16bdc843d87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.709264 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8851cf9e-656d-439d-a0d8-a16bdc843d87" (UID: "8851cf9e-656d-439d-a0d8-a16bdc843d87"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.714439 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-kube-api-access-dj7w6" (OuterVolumeSpecName: "kube-api-access-dj7w6") pod "7e0cbbc8-b706-4a93-bd1b-442a68cce24b" (UID: "7e0cbbc8-b706-4a93-bd1b-442a68cce24b"). InnerVolumeSpecName "kube-api-access-dj7w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.724399 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3078-account-create-update-mxjs7"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.760994 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.762024 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-log" containerID="cri-o://bde1fb753ad4d23d69552efc3946e3c4ff275d991136e6e6fc724f42a4350c75" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.762296 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-api" containerID="cri-o://8d638dd580ac1c508eca3ff370e5d3dd8062fb913eb5b6a7194fb27153cf2701" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.774407 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.774442 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj7w6\" (UniqueName: 
\"kubernetes.io/projected/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-kube-api-access-dj7w6\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.774456 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.774486 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.774498 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.774509 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.774519 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.784751 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mpvpt_5e05d002-d224-4a13-8497-fc49712f7084/openstack-network-exporter/0.log" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.784810 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.807829 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.808025 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.808080 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.808128 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.809044 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.809106 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.809126 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617"} Mar 13 
12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.809139 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.809151 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.814665 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-q9frn"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.816531 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.816630 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.816701 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.816773 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.816836 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" 
containerID="cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.816902 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.816973 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.817025 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.817006 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.817214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.817275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.817336 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.817423 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.817490 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.817546 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.846529 4786 generic.go:334] "Generic (PLEG): container finished" podID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerID="8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2" exitCode=143 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.846595 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1a06f1e9-ddda-42a5-ab33-88473c56a6c7","Type":"ContainerDied","Data":"8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.861471 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mfpdw"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.876556 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovs-rundir\") pod \"5e05d002-d224-4a13-8497-fc49712f7084\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.876618 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovn-rundir\") pod \"5e05d002-d224-4a13-8497-fc49712f7084\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.876699 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e05d002-d224-4a13-8497-fc49712f7084-config\") pod \"5e05d002-d224-4a13-8497-fc49712f7084\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.876717 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-combined-ca-bundle\") pod \"5e05d002-d224-4a13-8497-fc49712f7084\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.876778 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v5rl\" (UniqueName: \"kubernetes.io/projected/5e05d002-d224-4a13-8497-fc49712f7084-kube-api-access-9v5rl\") pod \"5e05d002-d224-4a13-8497-fc49712f7084\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.876815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-metrics-certs-tls-certs\") pod \"5e05d002-d224-4a13-8497-fc49712f7084\" (UID: \"5e05d002-d224-4a13-8497-fc49712f7084\") " Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 
12:13:17.878238 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e05d002-d224-4a13-8497-fc49712f7084-config" (OuterVolumeSpecName: "config") pod "5e05d002-d224-4a13-8497-fc49712f7084" (UID: "5e05d002-d224-4a13-8497-fc49712f7084"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.878350 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "5e05d002-d224-4a13-8497-fc49712f7084" (UID: "5e05d002-d224-4a13-8497-fc49712f7084"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.878431 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "5e05d002-d224-4a13-8497-fc49712f7084" (UID: "5e05d002-d224-4a13-8497-fc49712f7084"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.884161 4786 generic.go:334] "Generic (PLEG): container finished" podID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" exitCode=0 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.884254 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpch6" event={"ID":"187d55eb-db2f-4935-91cc-8ef51895a35a","Type":"ContainerDied","Data":"7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.896078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69ffc749-26dmp" event={"ID":"8851cf9e-656d-439d-a0d8-a16bdc843d87","Type":"ContainerDied","Data":"19c390b8d90e14fda47c66027733e1779bad7b3e6676beda27b49c7149f50586"} Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.896292 4786 scope.go:117] "RemoveContainer" containerID="894cb93b5b88c9cda5163ee38d365a1d2b6f2902380b8a636a78fe6a94bbf669" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.896505 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69ffc749-26dmp" Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.953870 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-dbb857556-g9c6x"] Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.954241 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerName="barbican-keystone-listener-log" containerID="cri-o://d022ca1a7d8f88f231dbd73accdaf4bd33c41433e29373fd3195771df609a146" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.963242 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerName="barbican-keystone-listener" containerID="cri-o://54b263afbc6e8053c748bc362cfd1bd6dfe5af3c222dd7810fe8d319eb6c30f9" gracePeriod=30 Mar 13 12:13:17 crc kubenswrapper[4786]: I0313 12:13:17.969331 4786 generic.go:334] "Generic (PLEG): container finished" podID="c9cf93cd-d636-4947-8318-0fade89f65d7" containerID="2196dbffe1af6cbceffdff832f9438466b94634b242cc388d431fdf75e8ded98" exitCode=137 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.011623 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7569c6d56c-2c7lj"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.017317 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7569c6d56c-2c7lj" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerName="barbican-worker-log" containerID="cri-o://7543fdc8467c20ef9ae263084f0987465a93b904cd4f492afe5d1897a9e24ab1" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.013362 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5e05d002-d224-4a13-8497-fc49712f7084-kube-api-access-9v5rl" (OuterVolumeSpecName: "kube-api-access-9v5rl") pod "5e05d002-d224-4a13-8497-fc49712f7084" (UID: "5e05d002-d224-4a13-8497-fc49712f7084"). InnerVolumeSpecName "kube-api-access-9v5rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.017658 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7569c6d56c-2c7lj" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerName="barbican-worker" containerID="cri-o://e0a352ffff2ccb7dffda5744bd27fe06dce994d483616356c61b3ccede71f2c0" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.030860 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mfpdw"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.043231 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e0cbbc8-b706-4a93-bd1b-442a68cce24b/ovsdbserver-sb/0.log" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.043453 4786 generic.go:334] "Generic (PLEG): container finished" podID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerID="cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f" exitCode=143 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.043877 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerName="cinder-scheduler" containerID="cri-o://72ec4a750cf5f8f8444ba20cf0c7ee683c4b7001b32595a47908763487ea5853" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.044433 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.044669 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e0cbbc8-b706-4a93-bd1b-442a68cce24b","Type":"ContainerDied","Data":"cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f"} Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.044770 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e0cbbc8-b706-4a93-bd1b-442a68cce24b","Type":"ContainerDied","Data":"d62cac7c78b0b8b9b878871534cc2e4db451db4a8e9e1afed2f546809e636f0e"} Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.045250 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerName="probe" containerID="cri-o://f8c4849551e632909c931cb9c230d8766911210a5e2a92f2d4a1214b86907ca7" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.052262 4786 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.052937 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e05d002-d224-4a13-8497-fc49712f7084-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.052982 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e05d002-d224-4a13-8497-fc49712f7084-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.052997 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v5rl\" (UniqueName: 
\"kubernetes.io/projected/5e05d002-d224-4a13-8497-fc49712f7084-kube-api-access-9v5rl\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.053026 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9d9fb9c86-4lc8x"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.053375 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9d9fb9c86-4lc8x" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api-log" containerID="cri-o://09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.053553 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9d9fb9c86-4lc8x" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api" containerID="cri-o://2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.084090 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-j474k"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.097616 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-q9frn"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.129389 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "7e0cbbc8-b706-4a93-bd1b-442a68cce24b" (UID: "7e0cbbc8-b706-4a93-bd1b-442a68cce24b"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.153140 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-j474k"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.155153 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.155349 4786 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.155491 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:20.155472483 +0000 UTC m=+1587.435125930 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scripts" not found Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.155622 4786 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.155724 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:20.15571094 +0000 UTC m=+1587.435364387 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-config-data" not found Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.155856 4786 secret.go:188] Couldn't get secret openstack/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" not found Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.155967 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:20.155954936 +0000 UTC m=+1587.435608383 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scheduler-config-data" not found Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.176236 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.176664 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cbebf1d2-7723-4d09-85de-a7e630caad3b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f743b475f641e7e3d24360e430567f8d2e636c83bf391096a2497df4fab645b6" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.176686 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e0cbbc8-b706-4a93-bd1b-442a68cce24b" (UID: "7e0cbbc8-b706-4a93-bd1b-442a68cce24b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.185113 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hncnh"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.209154 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-dbb48765-fzcqd" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.176:8080/healthcheck\": read tcp 10.217.0.2:54908->10.217.0.176:8080: read: connection reset by peer" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.209183 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-dbb48765-fzcqd" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.176:8080/healthcheck\": read tcp 10.217.0.2:54906->10.217.0.176:8080: read: connection reset by peer" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.216194 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.217051 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] 
Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.236352 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.236406 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.241220 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.261537 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.270896 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlf8"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.277029 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.279192 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-2jlf8"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.295464 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e05d002-d224-4a13-8497-fc49712f7084" (UID: "5e05d002-d224-4a13-8497-fc49712f7084"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.297687 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jqvv"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.304514 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.304759 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="b488d3ce-635a-4279-a05e-fba3b6599bda" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.312021 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jqvv"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.332487 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.332780 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="116541e7-d92f-48ff-ad78-7dba2f45fc18" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.373497 4786 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" probeResult="failure" output=< Mar 13 12:13:18 crc kubenswrapper[4786]: cat: /var/run/openvswitch/ovs-vswitchd.pid: No such file or directory Mar 13 12:13:18 crc kubenswrapper[4786]: ERROR - Failed to get pid for ovs-vswitchd, exit status: 0 Mar 13 12:13:18 crc kubenswrapper[4786]: > Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.379101 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.413916 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8851cf9e-656d-439d-a0d8-a16bdc843d87" (UID: "8851cf9e-656d-439d-a0d8-a16bdc843d87"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.414684 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8851cf9e-656d-439d-a0d8-a16bdc843d87" (UID: "8851cf9e-656d-439d-a0d8-a16bdc843d87"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.422756 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.430835 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.431738 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" containerName="nova-scheduler-scheduler" containerID="cri-o://baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.433122 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.441024 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.441093 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="b488d3ce-635a-4279-a05e-fba3b6599bda" containerName="nova-cell1-conductor-conductor" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.489691 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.489814 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.500135 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.507002 4786 scope.go:117] "RemoveContainer" containerID="2fe2f63b3058a5332c6e4555092190fe27dedba1ec7e07e481f9f8d9908c7412" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.523056 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8851cf9e-656d-439d-a0d8-a16bdc843d87" (UID: "8851cf9e-656d-439d-a0d8-a16bdc843d87"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.524722 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" containerName="galera" containerID="cri-o://ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7" gracePeriod=30 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.551570 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3a37be46-7b90-4c56-8dcf-a3ea45123df8/ovsdbserver-nb/0.log" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.551664 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.585172 4786 scope.go:117] "RemoveContainer" containerID="6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.585635 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" containerID="cri-o://57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" gracePeriod=28 Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.591575 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8851cf9e-656d-439d-a0d8-a16bdc843d87-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.595136 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hncnh"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.616707 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3078-account-create-update-mxjs7"] Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.633339 4786 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "5e05d002-d224-4a13-8497-fc49712f7084" (UID: "5e05d002-d224-4a13-8497-fc49712f7084"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.637940 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 12:13:18 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: if [ -n "" ]; then Mar 13 12:13:18 crc kubenswrapper[4786]: GRANT_DATABASE="" Mar 13 12:13:18 crc kubenswrapper[4786]: else Mar 13 12:13:18 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 12:13:18 crc kubenswrapper[4786]: fi Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 12:13:18 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 12:13:18 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 12:13:18 crc kubenswrapper[4786]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 13 12:13:18 crc kubenswrapper[4786]: # support updates Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.639951 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-hncnh" podUID="7effb60f-4f63-48d0-8b3e-1792e39c79d5" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.664562 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7e0cbbc8-b706-4a93-bd1b-442a68cce24b" (UID: "7e0cbbc8-b706-4a93-bd1b-442a68cce24b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.671621 4786 scope.go:117] "RemoveContainer" containerID="cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.672052 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 12:13:18 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: if [ -n "barbican" ]; then Mar 13 12:13:18 crc kubenswrapper[4786]: GRANT_DATABASE="barbican" Mar 13 12:13:18 crc kubenswrapper[4786]: else Mar 13 12:13:18 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 12:13:18 crc kubenswrapper[4786]: fi Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 12:13:18 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 12:13:18 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 12:13:18 crc kubenswrapper[4786]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 13 12:13:18 crc kubenswrapper[4786]: # support updates Mar 13 12:13:18 crc kubenswrapper[4786]: Mar 13 12:13:18 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.675956 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-3078-account-create-update-mxjs7" podUID="530d22ba-9371-4850-8c78-26323a26ad06" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.696804 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-scripts\") pod \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.696907 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqtc5\" (UniqueName: \"kubernetes.io/projected/3a37be46-7b90-4c56-8dcf-a3ea45123df8-kube-api-access-dqtc5\") pod \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.697784 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-combined-ca-bundle\") pod \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.697840 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-combined-ca-bundle\") pod 
\"c9cf93cd-d636-4947-8318-0fade89f65d7\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.697901 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.697981 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config-secret\") pod \"c9cf93cd-d636-4947-8318-0fade89f65d7\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.698019 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdbserver-nb-tls-certs\") pod \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.698161 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2srfh\" (UniqueName: \"kubernetes.io/projected/c9cf93cd-d636-4947-8318-0fade89f65d7-kube-api-access-2srfh\") pod \"c9cf93cd-d636-4947-8318-0fade89f65d7\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.698240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-metrics-certs-tls-certs\") pod \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.698279 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdb-rundir\") pod \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.698332 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-config\") pod \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\" (UID: \"3a37be46-7b90-4c56-8dcf-a3ea45123df8\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.698393 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config\") pod \"c9cf93cd-d636-4947-8318-0fade89f65d7\" (UID: \"c9cf93cd-d636-4947-8318-0fade89f65d7\") " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.699313 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0cbbc8-b706-4a93-bd1b-442a68cce24b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.699331 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e05d002-d224-4a13-8497-fc49712f7084-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.708834 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-scripts" (OuterVolumeSpecName: "scripts") pod "3a37be46-7b90-4c56-8dcf-a3ea45123df8" (UID: "3a37be46-7b90-4c56-8dcf-a3ea45123df8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.709972 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3a37be46-7b90-4c56-8dcf-a3ea45123df8" (UID: "3a37be46-7b90-4c56-8dcf-a3ea45123df8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.711739 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-config" (OuterVolumeSpecName: "config") pod "3a37be46-7b90-4c56-8dcf-a3ea45123df8" (UID: "3a37be46-7b90-4c56-8dcf-a3ea45123df8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.733121 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "3a37be46-7b90-4c56-8dcf-a3ea45123df8" (UID: "3a37be46-7b90-4c56-8dcf-a3ea45123df8"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.733202 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cf93cd-d636-4947-8318-0fade89f65d7-kube-api-access-2srfh" (OuterVolumeSpecName: "kube-api-access-2srfh") pod "c9cf93cd-d636-4947-8318-0fade89f65d7" (UID: "c9cf93cd-d636-4947-8318-0fade89f65d7"). InnerVolumeSpecName "kube-api-access-2srfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.733358 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a37be46-7b90-4c56-8dcf-a3ea45123df8-kube-api-access-dqtc5" (OuterVolumeSpecName: "kube-api-access-dqtc5") pod "3a37be46-7b90-4c56-8dcf-a3ea45123df8" (UID: "3a37be46-7b90-4c56-8dcf-a3ea45123df8"). InnerVolumeSpecName "kube-api-access-dqtc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.733949 4786 scope.go:117] "RemoveContainer" containerID="6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.741765 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b\": container with ID starting with 6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b not found: ID does not exist" containerID="6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.741959 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b"} err="failed to get container status \"6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b\": rpc error: code = NotFound desc = could not find container \"6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b\": container with ID starting with 6ec8bf4c9908ce9e2a042ba61d4f6d80cb8e26b840336c05dc94494f8573409b not found: ID does not exist" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.742100 4786 scope.go:117] "RemoveContainer" containerID="cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f" Mar 13 12:13:18 crc kubenswrapper[4786]: E0313 12:13:18.751088 
4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f\": container with ID starting with cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f not found: ID does not exist" containerID="cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.751148 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f"} err="failed to get container status \"cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f\": rpc error: code = NotFound desc = could not find container \"cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f\": container with ID starting with cb8adb8ca5ee0d6a352b2e4102a986a25ffb412972ec164044bf65a579f97f1f not found: ID does not exist" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.765888 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c9cf93cd-d636-4947-8318-0fade89f65d7" (UID: "c9cf93cd-d636-4947-8318-0fade89f65d7"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.789296 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9cf93cd-d636-4947-8318-0fade89f65d7" (UID: "c9cf93cd-d636-4947-8318-0fade89f65d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.802571 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2srfh\" (UniqueName: \"kubernetes.io/projected/c9cf93cd-d636-4947-8318-0fade89f65d7-kube-api-access-2srfh\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.802620 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.802636 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.802648 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.802658 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a37be46-7b90-4c56-8dcf-a3ea45123df8-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.802692 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqtc5\" (UniqueName: \"kubernetes.io/projected/3a37be46-7b90-4c56-8dcf-a3ea45123df8-kube-api-access-dqtc5\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.802704 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.802741 4786 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.847835 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.916269 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:18 crc kubenswrapper[4786]: I0313 12:13:18.941972 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "3a37be46-7b90-4c56-8dcf-a3ea45123df8" (UID: "3a37be46-7b90-4c56-8dcf-a3ea45123df8"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.012552 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a37be46-7b90-4c56-8dcf-a3ea45123df8" (UID: "3a37be46-7b90-4c56-8dcf-a3ea45123df8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.017659 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.017714 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.056734 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c9cf93cd-d636-4947-8318-0fade89f65d7" (UID: "c9cf93cd-d636-4947-8318-0fade89f65d7"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.073551 4786 generic.go:334] "Generic (PLEG): container finished" podID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerID="e0a352ffff2ccb7dffda5744bd27fe06dce994d483616356c61b3ccede71f2c0" exitCode=0 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.073577 4786 generic.go:334] "Generic (PLEG): container finished" podID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerID="7543fdc8467c20ef9ae263084f0987465a93b904cd4f492afe5d1897a9e24ab1" exitCode=143 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.073613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7569c6d56c-2c7lj" event={"ID":"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5","Type":"ContainerDied","Data":"e0a352ffff2ccb7dffda5744bd27fe06dce994d483616356c61b3ccede71f2c0"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.073640 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7569c6d56c-2c7lj" event={"ID":"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5","Type":"ContainerDied","Data":"7543fdc8467c20ef9ae263084f0987465a93b904cd4f492afe5d1897a9e24ab1"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.085484 4786 generic.go:334] "Generic (PLEG): container finished" podID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerID="bde1fb753ad4d23d69552efc3946e3c4ff275d991136e6e6fc724f42a4350c75" exitCode=143 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.085839 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67ad69d-5191-4d93-9326-b93b0653a82c","Type":"ContainerDied","Data":"bde1fb753ad4d23d69552efc3946e3c4ff275d991136e6e6fc724f42a4350c75"} Mar 13 12:13:19 crc kubenswrapper[4786]: E0313 12:13:19.104303 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.117540 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3a37be46-7b90-4c56-8dcf-a3ea45123df8" (UID: "3a37be46-7b90-4c56-8dcf-a3ea45123df8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.119996 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9cf93cd-d636-4947-8318-0fade89f65d7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.120022 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a37be46-7b90-4c56-8dcf-a3ea45123df8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.139331 4786 generic.go:334] "Generic (PLEG): container finished" podID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerID="617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd" exitCode=0 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.139407 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6577bdf497-p2bmr" event={"ID":"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c","Type":"ContainerDied","Data":"617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd"} Mar 13 12:13:19 crc kubenswrapper[4786]: E0313 12:13:19.141674 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:19 crc kubenswrapper[4786]: E0313 12:13:19.143416 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3 is running failed: container process not found" containerID="9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:19 crc kubenswrapper[4786]: E0313 12:13:19.143460 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="116541e7-d92f-48ff-ad78-7dba2f45fc18" containerName="nova-cell0-conductor-conductor" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.147963 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3078-account-create-update-mxjs7" event={"ID":"530d22ba-9371-4850-8c78-26323a26ad06","Type":"ContainerStarted","Data":"c42f7c509b9671f1a4efce9722ac04c0147dacaa61e879836be5c220fecc44a0"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.190128 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c" exitCode=0 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.190167 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3" exitCode=0 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.190241 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.190339 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.197654 4786 generic.go:334] "Generic (PLEG): container finished" podID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerID="b9aec14b391a1bbbd8f466a3df6625873e9ec6de58fe63728da3a16855652999" exitCode=143 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.197752 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ab23f85-03a5-4df3-bfa8-da6f748f44e3","Type":"ContainerDied","Data":"b9aec14b391a1bbbd8f466a3df6625873e9ec6de58fe63728da3a16855652999"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.207047 4786 generic.go:334] "Generic (PLEG): container finished" podID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerID="6faefcc8f8f08a959c2efe031b6171c7238793dbc61001559ea27795b9e169c2" exitCode=0 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.207082 4786 generic.go:334] "Generic (PLEG): container finished" podID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerID="bd18ddf3196c9bf5a4a9b83508e122c58fe4f289252ce5dccbca45f28bb401b8" exitCode=0 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.207156 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dbb48765-fzcqd" event={"ID":"129e2d9e-bcc5-4fb2-815c-29d99648b1f3","Type":"ContainerDied","Data":"6faefcc8f8f08a959c2efe031b6171c7238793dbc61001559ea27795b9e169c2"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.207258 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-proxy-dbb48765-fzcqd" event={"ID":"129e2d9e-bcc5-4fb2-815c-29d99648b1f3","Type":"ContainerDied","Data":"bd18ddf3196c9bf5a4a9b83508e122c58fe4f289252ce5dccbca45f28bb401b8"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.207273 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dbb48765-fzcqd" event={"ID":"129e2d9e-bcc5-4fb2-815c-29d99648b1f3","Type":"ContainerDied","Data":"352ae26b5bd48de42e619f7d26f8c71ff20bdba4e51f52483335312f2ce61310"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.207285 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="352ae26b5bd48de42e619f7d26f8c71ff20bdba4e51f52483335312f2ce61310" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.210440 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mpvpt_5e05d002-d224-4a13-8497-fc49712f7084/openstack-network-exporter/0.log" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.210590 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mpvpt" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.210690 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mpvpt" event={"ID":"5e05d002-d224-4a13-8497-fc49712f7084","Type":"ContainerDied","Data":"01a15da7ad3738f7bd7a017441463cf4c03ff825c19e9088268a352896105eb2"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.210729 4786 scope.go:117] "RemoveContainer" containerID="e7effd0a662e6015905b4c8a787a7e2eab9bffc913dcf3ece7c373f8bf960414" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.217711 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.221896 4786 generic.go:334] "Generic (PLEG): container finished" podID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerID="09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b" exitCode=143 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.222012 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fb9c86-4lc8x" event={"ID":"124c632a-4ff3-419c-9e26-ba68929feeb7","Type":"ContainerDied","Data":"09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.229572 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3a37be46-7b90-4c56-8dcf-a3ea45123df8/ovsdbserver-nb/0.log" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.229707 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a37be46-7b90-4c56-8dcf-a3ea45123df8","Type":"ContainerDied","Data":"324e06768cec01cf06404efb519d916ff9eb0647dda54f75c85a98192979ac4b"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.229742 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.241088 4786 generic.go:334] "Generic (PLEG): container finished" podID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerID="54b263afbc6e8053c748bc362cfd1bd6dfe5af3c222dd7810fe8d319eb6c30f9" exitCode=0 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.241116 4786 generic.go:334] "Generic (PLEG): container finished" podID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerID="d022ca1a7d8f88f231dbd73accdaf4bd33c41433e29373fd3195771df609a146" exitCode=143 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.241173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" event={"ID":"14222e06-64a4-424f-9b69-cb6d2b62c001","Type":"ContainerDied","Data":"54b263afbc6e8053c748bc362cfd1bd6dfe5af3c222dd7810fe8d319eb6c30f9"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.241205 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" event={"ID":"14222e06-64a4-424f-9b69-cb6d2b62c001","Type":"ContainerDied","Data":"d022ca1a7d8f88f231dbd73accdaf4bd33c41433e29373fd3195771df609a146"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.258723 4786 generic.go:334] "Generic (PLEG): container finished" podID="cbebf1d2-7723-4d09-85de-a7e630caad3b" containerID="f743b475f641e7e3d24360e430567f8d2e636c83bf391096a2497df4fab645b6" exitCode=0 Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.258787 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cbebf1d2-7723-4d09-85de-a7e630caad3b","Type":"ContainerDied","Data":"f743b475f641e7e3d24360e430567f8d2e636c83bf391096a2497df4fab645b6"} Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.262987 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hncnh" 
event={"ID":"7effb60f-4f63-48d0-8b3e-1792e39c79d5","Type":"ContainerStarted","Data":"53895c4cc1d98550390d009ab9248208c4b0b4f27286b2879baa68249b7aa3fe"} Mar 13 12:13:19 crc kubenswrapper[4786]: E0313 12:13:19.322343 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 13 12:13:19 crc kubenswrapper[4786]: E0313 12:13:19.322409 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data podName:53fea24b-7ca8-4c0a-96d1-458ca1e877a7 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:23.322395786 +0000 UTC m=+1590.602049233 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data") pod "rabbitmq-server-0" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7") : configmap "rabbitmq-config-data" not found Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.420518 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.426095 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-etc-swift\") pod \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.426141 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-combined-ca-bundle\") pod \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.426208 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf6nq\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-kube-api-access-rf6nq\") pod \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.426240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-config-data\") pod \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.427854 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-run-httpd\") pod \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.427903 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-log-httpd\") pod \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.428995 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.431726 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.432206 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-kube-api-access-rf6nq" (OuterVolumeSpecName: "kube-api-access-rf6nq") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "kube-api-access-rf6nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.446514 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.455969 4786 scope.go:117] "RemoveContainer" containerID="2196dbffe1af6cbceffdff832f9438466b94634b242cc388d431fdf75e8ded98" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.470654 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085820e1-a384-4656-8200-bb5ae71491ae" path="/var/lib/kubelet/pods/085820e1-a384-4656-8200-bb5ae71491ae/volumes" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.471364 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d674fa-8483-4cba-a0ad-49ebd1f68558" path="/var/lib/kubelet/pods/49d674fa-8483-4cba-a0ad-49ebd1f68558/volumes" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.472788 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804557dd-c3fc-4502-8b3f-4bcabfb93688" path="/var/lib/kubelet/pods/804557dd-c3fc-4502-8b3f-4bcabfb93688/volumes" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.474107 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8909d231-1928-4f63-b383-856cb26fa4a2" path="/var/lib/kubelet/pods/8909d231-1928-4f63-b383-856cb26fa4a2/volumes" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.476212 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a17ac7e3-0ac7-480c-9909-b4c3cc76696b" path="/var/lib/kubelet/pods/a17ac7e3-0ac7-480c-9909-b4c3cc76696b/volumes" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.476726 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befdb8e3-7615-4bd4-a6f8-dfa11bd924bd" path="/var/lib/kubelet/pods/befdb8e3-7615-4bd4-a6f8-dfa11bd924bd/volumes" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.478432 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cf93cd-d636-4947-8318-0fade89f65d7" path="/var/lib/kubelet/pods/c9cf93cd-d636-4947-8318-0fade89f65d7/volumes" Mar 13 
12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.480099 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf" path="/var/lib/kubelet/pods/d9b7dba0-2c8f-4032-9c81-6eb9b0c69adf/volumes" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.487798 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.487836 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.487851 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-mpvpt"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.487865 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-mpvpt"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.498653 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.499491 4786 scope.go:117] "RemoveContainer" containerID="75127e67378a2a6d7c4e145c4a096adfcfcfcab00c002649d437ac56addedfda" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.517278 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.532059 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-internal-tls-certs\") pod \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.532142 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-public-tls-certs\") pod 
\"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.533476 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.533526 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-26dmp"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.535120 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.535169 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf6nq\" (UniqueName: \"kubernetes.io/projected/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-kube-api-access-rf6nq\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.535183 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.535190 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.571475 4786 scope.go:117] "RemoveContainer" containerID="9c3675158323b3cf247da0933d56c27d29b953868a19dce3603f8acbc417ab01" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.595844 
4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69ffc749-26dmp"] Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.606263 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.640358 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-config-data" (OuterVolumeSpecName: "config-data") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.647938 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.647972 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.686999 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.705774 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.718780 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7569c6d56c-2c7lj" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.739175 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.750053 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.750273 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data\") pod \"14222e06-64a4-424f-9b69-cb6d2b62c001\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.750436 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530d22ba-9371-4850-8c78-26323a26ad06-operator-scripts\") pod \"530d22ba-9371-4850-8c78-26323a26ad06\" (UID: \"530d22ba-9371-4850-8c78-26323a26ad06\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.750553 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-logs\") pod \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") 
" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.750662 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data\") pod \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.750777 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data-custom\") pod \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.750941 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-combined-ca-bundle\") pod \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.751050 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xhrc\" (UniqueName: \"kubernetes.io/projected/cbebf1d2-7723-4d09-85de-a7e630caad3b-kube-api-access-4xhrc\") pod \"cbebf1d2-7723-4d09-85de-a7e630caad3b\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.751133 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-nova-novncproxy-tls-certs\") pod \"cbebf1d2-7723-4d09-85de-a7e630caad3b\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.751232 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqs7z\" (UniqueName: 
\"kubernetes.io/projected/530d22ba-9371-4850-8c78-26323a26ad06-kube-api-access-pqs7z\") pod \"530d22ba-9371-4850-8c78-26323a26ad06\" (UID: \"530d22ba-9371-4850-8c78-26323a26ad06\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.751419 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14222e06-64a4-424f-9b69-cb6d2b62c001-logs\") pod \"14222e06-64a4-424f-9b69-cb6d2b62c001\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.751496 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data-custom\") pod \"14222e06-64a4-424f-9b69-cb6d2b62c001\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.751566 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-combined-ca-bundle\") pod \"14222e06-64a4-424f-9b69-cb6d2b62c001\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.751643 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-config-data\") pod \"cbebf1d2-7723-4d09-85de-a7e630caad3b\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.751721 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b5lb\" (UniqueName: \"kubernetes.io/projected/14222e06-64a4-424f-9b69-cb6d2b62c001-kube-api-access-2b5lb\") pod \"14222e06-64a4-424f-9b69-cb6d2b62c001\" (UID: \"14222e06-64a4-424f-9b69-cb6d2b62c001\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 
12:13:19.751835 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-public-tls-certs\") pod \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\" (UID: \"129e2d9e-bcc5-4fb2-815c-29d99648b1f3\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.753585 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-combined-ca-bundle\") pod \"cbebf1d2-7723-4d09-85de-a7e630caad3b\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.753678 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqzff\" (UniqueName: \"kubernetes.io/projected/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-kube-api-access-vqzff\") pod \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\" (UID: \"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.753779 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-vencrypt-tls-certs\") pod \"cbebf1d2-7723-4d09-85de-a7e630caad3b\" (UID: \"cbebf1d2-7723-4d09-85de-a7e630caad3b\") " Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.754427 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: W0313 12:13:19.753374 4786 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/129e2d9e-bcc5-4fb2-815c-29d99648b1f3/volumes/kubernetes.io~secret/public-tls-certs Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.756022 4786 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "129e2d9e-bcc5-4fb2-815c-29d99648b1f3" (UID: "129e2d9e-bcc5-4fb2-815c-29d99648b1f3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.764251 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/530d22ba-9371-4850-8c78-26323a26ad06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "530d22ba-9371-4850-8c78-26323a26ad06" (UID: "530d22ba-9371-4850-8c78-26323a26ad06"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.764795 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-logs" (OuterVolumeSpecName: "logs") pod "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" (UID: "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.769337 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14222e06-64a4-424f-9b69-cb6d2b62c001-logs" (OuterVolumeSpecName: "logs") pod "14222e06-64a4-424f-9b69-cb6d2b62c001" (UID: "14222e06-64a4-424f-9b69-cb6d2b62c001"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.779163 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbebf1d2-7723-4d09-85de-a7e630caad3b-kube-api-access-4xhrc" (OuterVolumeSpecName: "kube-api-access-4xhrc") pod "cbebf1d2-7723-4d09-85de-a7e630caad3b" (UID: "cbebf1d2-7723-4d09-85de-a7e630caad3b"). InnerVolumeSpecName "kube-api-access-4xhrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.781020 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14222e06-64a4-424f-9b69-cb6d2b62c001-kube-api-access-2b5lb" (OuterVolumeSpecName: "kube-api-access-2b5lb") pod "14222e06-64a4-424f-9b69-cb6d2b62c001" (UID: "14222e06-64a4-424f-9b69-cb6d2b62c001"). InnerVolumeSpecName "kube-api-access-2b5lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.783991 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "14222e06-64a4-424f-9b69-cb6d2b62c001" (UID: "14222e06-64a4-424f-9b69-cb6d2b62c001"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.794035 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" (UID: "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.794090 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/530d22ba-9371-4850-8c78-26323a26ad06-kube-api-access-pqs7z" (OuterVolumeSpecName: "kube-api-access-pqs7z") pod "530d22ba-9371-4850-8c78-26323a26ad06" (UID: "530d22ba-9371-4850-8c78-26323a26ad06"). InnerVolumeSpecName "kube-api-access-pqs7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.809125 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-kube-api-access-vqzff" (OuterVolumeSpecName: "kube-api-access-vqzff") pod "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" (UID: "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5"). InnerVolumeSpecName "kube-api-access-vqzff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.859647 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqs7z\" (UniqueName: \"kubernetes.io/projected/530d22ba-9371-4850-8c78-26323a26ad06-kube-api-access-pqs7z\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.859680 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14222e06-64a4-424f-9b69-cb6d2b62c001-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.859693 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.862949 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b5lb\" (UniqueName: 
\"kubernetes.io/projected/14222e06-64a4-424f-9b69-cb6d2b62c001-kube-api-access-2b5lb\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.862988 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/129e2d9e-bcc5-4fb2-815c-29d99648b1f3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.863002 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqzff\" (UniqueName: \"kubernetes.io/projected/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-kube-api-access-vqzff\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.863028 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530d22ba-9371-4850-8c78-26323a26ad06-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.863041 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.863054 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.863066 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xhrc\" (UniqueName: \"kubernetes.io/projected/cbebf1d2-7723-4d09-85de-a7e630caad3b-kube-api-access-4xhrc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.886729 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-config-data" (OuterVolumeSpecName: "config-data") pod 
"cbebf1d2-7723-4d09-85de-a7e630caad3b" (UID: "cbebf1d2-7723-4d09-85de-a7e630caad3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.905999 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" (UID: "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.928536 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14222e06-64a4-424f-9b69-cb6d2b62c001" (UID: "14222e06-64a4-424f-9b69-cb6d2b62c001"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.966990 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data" (OuterVolumeSpecName: "config-data") pod "14222e06-64a4-424f-9b69-cb6d2b62c001" (UID: "14222e06-64a4-424f-9b69-cb6d2b62c001"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.969119 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "cbebf1d2-7723-4d09-85de-a7e630caad3b" (UID: "cbebf1d2-7723-4d09-85de-a7e630caad3b"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.975076 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbebf1d2-7723-4d09-85de-a7e630caad3b" (UID: "cbebf1d2-7723-4d09-85de-a7e630caad3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.977991 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.978106 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.978151 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.978167 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14222e06-64a4-424f-9b69-cb6d2b62c001-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.978181 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.978192 4786 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:19 crc kubenswrapper[4786]: I0313 12:13:19.988459 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "cbebf1d2-7723-4d09-85de-a7e630caad3b" (UID: "cbebf1d2-7723-4d09-85de-a7e630caad3b"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.005215 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data" (OuterVolumeSpecName: "config-data") pod "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" (UID: "4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.086913 4786 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbebf1d2-7723-4d09-85de-a7e630caad3b-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.086952 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.094054 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.102576 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.120045 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188471 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-combined-ca-bundle\") pod \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188513 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188545 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-config-data\") pod \"116541e7-d92f-48ff-ad78-7dba2f45fc18\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188571 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-generated\") pod \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188592 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbgn4\" (UniqueName: \"kubernetes.io/projected/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kube-api-access-kbgn4\") pod \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\" (UID: 
\"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188634 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q7k2\" (UniqueName: \"kubernetes.io/projected/7effb60f-4f63-48d0-8b3e-1792e39c79d5-kube-api-access-9q7k2\") pod \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\" (UID: \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7effb60f-4f63-48d0-8b3e-1792e39c79d5-operator-scripts\") pod \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\" (UID: \"7effb60f-4f63-48d0-8b3e-1792e39c79d5\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188721 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8h2f\" (UniqueName: \"kubernetes.io/projected/116541e7-d92f-48ff-ad78-7dba2f45fc18-kube-api-access-g8h2f\") pod \"116541e7-d92f-48ff-ad78-7dba2f45fc18\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188744 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-galera-tls-certs\") pod \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188837 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-default\") pod \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188874 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kolla-config\") pod \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188915 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-combined-ca-bundle\") pod \"116541e7-d92f-48ff-ad78-7dba2f45fc18\" (UID: \"116541e7-d92f-48ff-ad78-7dba2f45fc18\") " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.188997 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-operator-scripts\") pod \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\" (UID: \"8b28a544-1e5f-46f0-a6d9-7a147c5d737e\") " Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.189562 4786 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.189629 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:24.189609616 +0000 UTC m=+1591.469263073 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-config-data" not found Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.194446 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b28a544-1e5f-46f0-a6d9-7a147c5d737e" (UID: "8b28a544-1e5f-46f0-a6d9-7a147c5d737e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.194481 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8b28a544-1e5f-46f0-a6d9-7a147c5d737e" (UID: "8b28a544-1e5f-46f0-a6d9-7a147c5d737e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.194960 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8b28a544-1e5f-46f0-a6d9-7a147c5d737e" (UID: "8b28a544-1e5f-46f0-a6d9-7a147c5d737e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.196770 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7effb60f-4f63-48d0-8b3e-1792e39c79d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7effb60f-4f63-48d0-8b3e-1792e39c79d5" (UID: "7effb60f-4f63-48d0-8b3e-1792e39c79d5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.197298 4786 secret.go:188] Couldn't get secret openstack/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" not found Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.197351 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:24.197332906 +0000 UTC m=+1591.476986353 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scheduler-config-data" not found Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.204117 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8b28a544-1e5f-46f0-a6d9-7a147c5d737e" (UID: "8b28a544-1e5f-46f0-a6d9-7a147c5d737e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.205219 4786 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.205305 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:24.205285402 +0000 UTC m=+1591.484938929 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scripts" not found Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.215736 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kube-api-access-kbgn4" (OuterVolumeSpecName: "kube-api-access-kbgn4") pod "8b28a544-1e5f-46f0-a6d9-7a147c5d737e" (UID: "8b28a544-1e5f-46f0-a6d9-7a147c5d737e"). InnerVolumeSpecName "kube-api-access-kbgn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.215838 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7effb60f-4f63-48d0-8b3e-1792e39c79d5-kube-api-access-9q7k2" (OuterVolumeSpecName: "kube-api-access-9q7k2") pod "7effb60f-4f63-48d0-8b3e-1792e39c79d5" (UID: "7effb60f-4f63-48d0-8b3e-1792e39c79d5"). InnerVolumeSpecName "kube-api-access-9q7k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.222474 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/116541e7-d92f-48ff-ad78-7dba2f45fc18-kube-api-access-g8h2f" (OuterVolumeSpecName: "kube-api-access-g8h2f") pod "116541e7-d92f-48ff-ad78-7dba2f45fc18" (UID: "116541e7-d92f-48ff-ad78-7dba2f45fc18"). InnerVolumeSpecName "kube-api-access-g8h2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.269613 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b28a544-1e5f-46f0-a6d9-7a147c5d737e" (UID: "8b28a544-1e5f-46f0-a6d9-7a147c5d737e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.281285 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "8b28a544-1e5f-46f0-a6d9-7a147c5d737e" (UID: "8b28a544-1e5f-46f0-a6d9-7a147c5d737e"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291531 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291569 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291598 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291611 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291624 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbgn4\" (UniqueName: \"kubernetes.io/projected/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kube-api-access-kbgn4\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291636 4786 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-9q7k2\" (UniqueName: \"kubernetes.io/projected/7effb60f-4f63-48d0-8b3e-1792e39c79d5-kube-api-access-9q7k2\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291646 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7effb60f-4f63-48d0-8b3e-1792e39c79d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291655 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8h2f\" (UniqueName: \"kubernetes.io/projected/116541e7-d92f-48ff-ad78-7dba2f45fc18-kube-api-access-g8h2f\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291665 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.291674 4786 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.294083 4786 generic.go:334] "Generic (PLEG): container finished" podID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerID="f8c4849551e632909c931cb9c230d8766911210a5e2a92f2d4a1214b86907ca7" exitCode=0 Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.294173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441","Type":"ContainerDied","Data":"f8c4849551e632909c931cb9c230d8766911210a5e2a92f2d4a1214b86907ca7"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.299044 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-config-data" (OuterVolumeSpecName: "config-data") pod "116541e7-d92f-48ff-ad78-7dba2f45fc18" (UID: "116541e7-d92f-48ff-ad78-7dba2f45fc18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.302912 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "116541e7-d92f-48ff-ad78-7dba2f45fc18" (UID: "116541e7-d92f-48ff-ad78-7dba2f45fc18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.303408 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cbebf1d2-7723-4d09-85de-a7e630caad3b","Type":"ContainerDied","Data":"0375897e15c4dd7ecbe37dd8c4b9b34d616b183dfdff1667d0aa74cb777bce21"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.303457 4786 scope.go:117] "RemoveContainer" containerID="f743b475f641e7e3d24360e430567f8d2e636c83bf391096a2497df4fab645b6" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.303427 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.310676 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7569c6d56c-2c7lj" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.311360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7569c6d56c-2c7lj" event={"ID":"4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5","Type":"ContainerDied","Data":"b7c402ea2a2d8ff40719f66c2cd8d5d07da290ffae5d34cd97e065ea0e3a53e0"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.316288 4786 generic.go:334] "Generic (PLEG): container finished" podID="116541e7-d92f-48ff-ad78-7dba2f45fc18" containerID="9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3" exitCode=0 Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.316410 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.317589 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"116541e7-d92f-48ff-ad78-7dba2f45fc18","Type":"ContainerDied","Data":"9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.317698 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"116541e7-d92f-48ff-ad78-7dba2f45fc18","Type":"ContainerDied","Data":"ec15dc619d7e4c9c7de040774d7a5874bebf4d51d48d5087c5449ade950574bc"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.322803 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.325642 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hncnh" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.325963 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hncnh" event={"ID":"7effb60f-4f63-48d0-8b3e-1792e39c79d5","Type":"ContainerDied","Data":"53895c4cc1d98550390d009ab9248208c4b0b4f27286b2879baa68249b7aa3fe"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.342253 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.342518 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dbb857556-g9c6x" event={"ID":"14222e06-64a4-424f-9b69-cb6d2b62c001","Type":"ContainerDied","Data":"88ab0757184513178d6464ffc74b12848dc38e911717aca34daf0d7bfcaa2800"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.347082 4786 generic.go:334] "Generic (PLEG): container finished" podID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" containerID="ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7" exitCode=0 Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.347140 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b28a544-1e5f-46f0-a6d9-7a147c5d737e","Type":"ContainerDied","Data":"ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.347166 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b28a544-1e5f-46f0-a6d9-7a147c5d737e","Type":"ContainerDied","Data":"b2ccbd80ec561cb0121e0d92ad8395ad6bf7bdc1ad86a2abd15af0803bba6f61"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.347219 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.365443 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.171:8776/healthcheck\": read tcp 10.217.0.2:49682->10.217.0.171:8776: read: connection reset by peer" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.376324 4786 scope.go:117] "RemoveContainer" containerID="e0a352ffff2ccb7dffda5744bd27fe06dce994d483616356c61b3ccede71f2c0" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.376484 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.378046 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-dbb48765-fzcqd" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.378247 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3078-account-create-update-mxjs7" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.378999 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3078-account-create-update-mxjs7" event={"ID":"530d22ba-9371-4850-8c78-26323a26ad06","Type":"ContainerDied","Data":"c42f7c509b9671f1a4efce9722ac04c0147dacaa61e879836be5c220fecc44a0"} Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.390352 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.395128 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.395149 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.395158 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116541e7-d92f-48ff-ad78-7dba2f45fc18-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.435386 4786 scope.go:117] "RemoveContainer" containerID="7543fdc8467c20ef9ae263084f0987465a93b904cd4f492afe5d1897a9e24ab1" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.457379 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7569c6d56c-2c7lj"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.474661 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8b28a544-1e5f-46f0-a6d9-7a147c5d737e" (UID: 
"8b28a544-1e5f-46f0-a6d9-7a147c5d737e"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.476393 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7569c6d56c-2c7lj"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.493475 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.497073 4786 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b28a544-1e5f-46f0-a6d9-7a147c5d737e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.502861 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.513849 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-dbb857556-g9c6x"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.523225 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-dbb857556-g9c6x"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.549356 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3078-account-create-update-mxjs7"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.567004 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3078-account-create-update-mxjs7"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.587396 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hncnh"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.593609 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hncnh"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.607042 
4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-dbb48765-fzcqd"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.612367 4786 scope.go:117] "RemoveContainer" containerID="9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.612769 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-dbb48765-fzcqd"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.635863 4786 scope.go:117] "RemoveContainer" containerID="9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3" Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.636661 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3\": container with ID starting with 9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3 not found: ID does not exist" containerID="9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.636687 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3"} err="failed to get container status \"9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3\": rpc error: code = NotFound desc = could not find container \"9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3\": container with ID starting with 9fba6c56ce6a889b63aef316e5dbb7a822cb145aa12972e0ffe2f7e4200915a3 not found: ID does not exist" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.636707 4786 scope.go:117] "RemoveContainer" containerID="54b263afbc6e8053c748bc362cfd1bd6dfe5af3c222dd7810fe8d319eb6c30f9" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.684572 4786 scope.go:117] "RemoveContainer" 
containerID="d022ca1a7d8f88f231dbd73accdaf4bd33c41433e29373fd3195771df609a146" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.701781 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.706627 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.728331 4786 scope.go:117] "RemoveContainer" containerID="ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.793214 4786 scope.go:117] "RemoveContainer" containerID="fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.833143 4786 scope.go:117] "RemoveContainer" containerID="ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7" Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.835339 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7\": container with ID starting with ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7 not found: ID does not exist" containerID="ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.835369 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7"} err="failed to get container status \"ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7\": rpc error: code = NotFound desc = could not find container \"ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7\": container with ID starting with ebaae3e9650351636f79c37f9b841693d12ba98ba1c8416e05000c6cad1243a7 not found: ID does not exist" Mar 
13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.835388 4786 scope.go:117] "RemoveContainer" containerID="fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8" Mar 13 12:13:20 crc kubenswrapper[4786]: E0313 12:13:20.837646 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8\": container with ID starting with fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8 not found: ID does not exist" containerID="fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.837672 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8"} err="failed to get container status \"fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8\": rpc error: code = NotFound desc = could not find container \"fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8\": container with ID starting with fa22769f9eb12f7cd8a7f11597273277cd0250556ea7e6131d0613077d74d1f8 not found: ID does not exist" Mar 13 12:13:20 crc kubenswrapper[4786]: I0313 12:13:20.911721 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-scripts\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011112 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa75843b-0c7d-49c1-be09-bef85ec8fd16-logs\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011136 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-public-tls-certs\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011151 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa75843b-0c7d-49c1-be09-bef85ec8fd16-etc-machine-id\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011173 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-combined-ca-bundle\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011237 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrxxn\" (UniqueName: 
\"kubernetes.io/projected/aa75843b-0c7d-49c1-be09-bef85ec8fd16-kube-api-access-wrxxn\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011305 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011322 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-internal-tls-certs\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.011340 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data-custom\") pod \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\" (UID: \"aa75843b-0c7d-49c1-be09-bef85ec8fd16\") " Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.012393 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa75843b-0c7d-49c1-be09-bef85ec8fd16-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.016835 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.019177 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa75843b-0c7d-49c1-be09-bef85ec8fd16-logs" (OuterVolumeSpecName: "logs") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.019247 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-scripts" (OuterVolumeSpecName: "scripts") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.022097 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa75843b-0c7d-49c1-be09-bef85ec8fd16-kube-api-access-wrxxn" (OuterVolumeSpecName: "kube-api-access-wrxxn") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "kube-api-access-wrxxn". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047232 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wvn8j"]
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047576 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8851cf9e-656d-439d-a0d8-a16bdc843d87" containerName="init"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047621 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8851cf9e-656d-439d-a0d8-a16bdc843d87" containerName="init"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047633 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerName="barbican-keystone-listener-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047640 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerName="barbican-keystone-listener-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047647 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbebf1d2-7723-4d09-85de-a7e630caad3b" containerName="nova-cell1-novncproxy-novncproxy"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047654 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbebf1d2-7723-4d09-85de-a7e630caad3b" containerName="nova-cell1-novncproxy-novncproxy"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047676 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047682 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047690 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047695 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047713 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerName="barbican-keystone-listener"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047718 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerName="barbican-keystone-listener"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047730 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerName="ovsdbserver-sb"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047735 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerName="ovsdbserver-sb"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047747 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8851cf9e-656d-439d-a0d8-a16bdc843d87" containerName="dnsmasq-dns"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047753 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8851cf9e-656d-439d-a0d8-a16bdc843d87" containerName="dnsmasq-dns"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047760 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" containerName="galera"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047766 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" containerName="galera"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047773 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-httpd"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047778 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-httpd"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047786 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-server"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047791 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-server"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047801 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047806 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047814 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e05d002-d224-4a13-8497-fc49712f7084" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047820 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e05d002-d224-4a13-8497-fc49712f7084" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047831 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116541e7-d92f-48ff-ad78-7dba2f45fc18" containerName="nova-cell0-conductor-conductor"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047838 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="116541e7-d92f-48ff-ad78-7dba2f45fc18" containerName="nova-cell0-conductor-conductor"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047849 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047857 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047874 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="ovsdbserver-nb"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047896 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="ovsdbserver-nb"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047907 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerName="barbican-worker"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047913 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerName="barbican-worker"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047924 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" containerName="mysql-bootstrap"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047930 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" containerName="mysql-bootstrap"
Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.047941 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerName="barbican-worker-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.047946 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerName="barbican-worker-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048090 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerName="ovsdbserver-sb"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048103 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048113 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerName="barbican-keystone-listener"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048121 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-server"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048134 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerName="barbican-worker-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048144 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048149 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8851cf9e-656d-439d-a0d8-a16bdc843d87" containerName="dnsmasq-dns"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048347 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" containerName="barbican-keystone-listener-log"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048354 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" containerName="barbican-worker"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048362 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbebf1d2-7723-4d09-85de-a7e630caad3b" containerName="nova-cell1-novncproxy-novncproxy"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048372 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" containerName="proxy-httpd"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048380 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" containerName="galera"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.048996 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="ovsdbserver-nb"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.049007 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.049029 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerName="cinder-api"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.049040 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="116541e7-d92f-48ff-ad78-7dba2f45fc18" containerName="nova-cell0-conductor-conductor"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.049052 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e05d002-d224-4a13-8497-fc49712f7084" containerName="openstack-network-exporter"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.067151 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wvn8j"]
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.067251 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.069817 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.081913 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.112486 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w6w5\" (UniqueName: \"kubernetes.io/projected/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-kube-api-access-8w6w5\") pod \"root-account-create-update-wvn8j\" (UID: \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\") " pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.112585 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-operator-scripts\") pod \"root-account-create-update-wvn8j\" (UID: \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\") " pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.112727 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.112738 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrxxn\" (UniqueName: \"kubernetes.io/projected/aa75843b-0c7d-49c1-be09-bef85ec8fd16-kube-api-access-wrxxn\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.112749 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.112758 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.112765 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa75843b-0c7d-49c1-be09-bef85ec8fd16-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.112774 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa75843b-0c7d-49c1-be09-bef85ec8fd16-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.119671 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.138359 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data" (OuterVolumeSpecName: "config-data") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.171537 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.178406 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aa75843b-0c7d-49c1-be09-bef85ec8fd16" (UID: "aa75843b-0c7d-49c1-be09-bef85ec8fd16"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214011 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-scripts\") pod \"3e9745df-949d-443d-93bb-0e5b3692ccd6\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214066 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-config-data\") pod \"3e9745df-949d-443d-93bb-0e5b3692ccd6\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214099 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"3e9745df-949d-443d-93bb-0e5b3692ccd6\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214162 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnhzb\" (UniqueName: \"kubernetes.io/projected/3e9745df-949d-443d-93bb-0e5b3692ccd6-kube-api-access-hnhzb\") pod \"3e9745df-949d-443d-93bb-0e5b3692ccd6\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214205 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-logs\") pod \"3e9745df-949d-443d-93bb-0e5b3692ccd6\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214273 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-httpd-run\") pod \"3e9745df-949d-443d-93bb-0e5b3692ccd6\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214350 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-internal-tls-certs\") pod \"3e9745df-949d-443d-93bb-0e5b3692ccd6\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214389 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-combined-ca-bundle\") pod \"3e9745df-949d-443d-93bb-0e5b3692ccd6\" (UID: \"3e9745df-949d-443d-93bb-0e5b3692ccd6\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214694 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w6w5\" (UniqueName: \"kubernetes.io/projected/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-kube-api-access-8w6w5\") pod \"root-account-create-update-wvn8j\" (UID: \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\") " pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214788 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-operator-scripts\") pod \"root-account-create-update-wvn8j\" (UID: \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\") " pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214957 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214979 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.214994 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa75843b-0c7d-49c1-be09-bef85ec8fd16-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.215472 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-logs" (OuterVolumeSpecName: "logs") pod "3e9745df-949d-443d-93bb-0e5b3692ccd6" (UID: "3e9745df-949d-443d-93bb-0e5b3692ccd6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.215781 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-operator-scripts\") pod \"root-account-create-update-wvn8j\" (UID: \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\") " pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.216652 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3e9745df-949d-443d-93bb-0e5b3692ccd6" (UID: "3e9745df-949d-443d-93bb-0e5b3692ccd6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.219786 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-scripts" (OuterVolumeSpecName: "scripts") pod "3e9745df-949d-443d-93bb-0e5b3692ccd6" (UID: "3e9745df-949d-443d-93bb-0e5b3692ccd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.220316 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "3e9745df-949d-443d-93bb-0e5b3692ccd6" (UID: "3e9745df-949d-443d-93bb-0e5b3692ccd6"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.221497 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e9745df-949d-443d-93bb-0e5b3692ccd6-kube-api-access-hnhzb" (OuterVolumeSpecName: "kube-api-access-hnhzb") pod "3e9745df-949d-443d-93bb-0e5b3692ccd6" (UID: "3e9745df-949d-443d-93bb-0e5b3692ccd6"). InnerVolumeSpecName "kube-api-access-hnhzb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.242249 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w6w5\" (UniqueName: \"kubernetes.io/projected/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-kube-api-access-8w6w5\") pod \"root-account-create-update-wvn8j\" (UID: \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\") " pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.247149 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e9745df-949d-443d-93bb-0e5b3692ccd6" (UID: "3e9745df-949d-443d-93bb-0e5b3692ccd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.292158 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e9745df-949d-443d-93bb-0e5b3692ccd6" (UID: "3e9745df-949d-443d-93bb-0e5b3692ccd6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.292669 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-config-data" (OuterVolumeSpecName: "config-data") pod "3e9745df-949d-443d-93bb-0e5b3692ccd6" (UID: "3e9745df-949d-443d-93bb-0e5b3692ccd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.316871 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.316943 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnhzb\" (UniqueName: \"kubernetes.io/projected/3e9745df-949d-443d-93bb-0e5b3692ccd6-kube-api-access-hnhzb\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.316953 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.316975 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e9745df-949d-443d-93bb-0e5b3692ccd6-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.316985 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.316994 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.317002 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.317009 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9745df-949d-443d-93bb-0e5b3692ccd6-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.383784 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.425435 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.462685 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.499173 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="116541e7-d92f-48ff-ad78-7dba2f45fc18" path="/var/lib/kubelet/pods/116541e7-d92f-48ff-ad78-7dba2f45fc18/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.509253 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129e2d9e-bcc5-4fb2-815c-29d99648b1f3" path="/var/lib/kubelet/pods/129e2d9e-bcc5-4fb2-815c-29d99648b1f3/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.510346 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14222e06-64a4-424f-9b69-cb6d2b62c001" path="/var/lib/kubelet/pods/14222e06-64a4-424f-9b69-cb6d2b62c001/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.511509 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a37be46-7b90-4c56-8dcf-a3ea45123df8" path="/var/lib/kubelet/pods/3a37be46-7b90-4c56-8dcf-a3ea45123df8/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.513537 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5" path="/var/lib/kubelet/pods/4d14f8f2-86d6-41b4-a3b1-5928b3e17fa5/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.514266 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="530d22ba-9371-4850-8c78-26323a26ad06" path="/var/lib/kubelet/pods/530d22ba-9371-4850-8c78-26323a26ad06/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.514727 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e05d002-d224-4a13-8497-fc49712f7084" path="/var/lib/kubelet/pods/5e05d002-d224-4a13-8497-fc49712f7084/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.515982 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0cbbc8-b706-4a93-bd1b-442a68cce24b" path="/var/lib/kubelet/pods/7e0cbbc8-b706-4a93-bd1b-442a68cce24b/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.517490 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7effb60f-4f63-48d0-8b3e-1792e39c79d5" path="/var/lib/kubelet/pods/7effb60f-4f63-48d0-8b3e-1792e39c79d5/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.518003 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8851cf9e-656d-439d-a0d8-a16bdc843d87" path="/var/lib/kubelet/pods/8851cf9e-656d-439d-a0d8-a16bdc843d87/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.519175 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b28a544-1e5f-46f0-a6d9-7a147c5d737e" path="/var/lib/kubelet/pods/8b28a544-1e5f-46f0-a6d9-7a147c5d737e/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.519810 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbebf1d2-7723-4d09-85de-a7e630caad3b" path="/var/lib/kubelet/pods/cbebf1d2-7723-4d09-85de-a7e630caad3b/volumes"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.520690 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.520782 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.521107 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="39720781-e027-4319-9c8f-1d9134d269f8" containerName="kube-state-metrics" containerID="cri-o://e54841fed760ec7f6d2745e7319245bad6d4e8f266b28143dbe25cdfa3e60e17" gracePeriod=30
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.522471 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="ceilometer-central-agent" containerID="cri-o://1b0b367e7cd0a1267707201fcc6eb17e95461077f6d3e9b86822b55b231ea0c0" gracePeriod=30
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.522628 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="proxy-httpd" containerID="cri-o://19ed6f38037a55c43058db0a67693dffe38372d306c408426bb30752659582c5" gracePeriod=30
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.522670 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="sg-core" containerID="cri-o://6ca4ed1353fe2122e66b7cdc238326a51066ac9f00f84fe43e52d17f553e850a" gracePeriod=30
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.522712 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="ceilometer-notification-agent" containerID="cri-o://c90607a25b6719b906805f4956767bf9e8f2062f95bfc51dac8f6059d27ae384" gracePeriod=30
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.524254 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-88c9bb894-zzsvv"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.527030 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-config-data\") pod \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.527168 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-public-tls-certs\") pod \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.527197 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-scripts\") pod \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.527235 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcjz4\" (UniqueName: \"kubernetes.io/projected/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-kube-api-access-jcjz4\") pod \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.527258 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-internal-tls-certs\") pod \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.527283 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-combined-ca-bundle\") pod \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.527311 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-logs\") pod \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\" (UID: \"b0d491ad-ee68-47bb-a1e3-66d22ecca41a\") "
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.535450 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-logs" (OuterVolumeSpecName: "logs") pod "b0d491ad-ee68-47bb-a1e3-66d22ecca41a" (UID: "b0d491ad-ee68-47bb-a1e3-66d22ecca41a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.547793 4786 generic.go:334] "Generic (PLEG): container finished" podID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerID="8d638dd580ac1c508eca3ff370e5d3dd8062fb913eb5b6a7194fb27153cf2701" exitCode=0
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.547906 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67ad69d-5191-4d93-9326-b93b0653a82c","Type":"ContainerDied","Data":"8d638dd580ac1c508eca3ff370e5d3dd8062fb913eb5b6a7194fb27153cf2701"}
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.577479 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-kube-api-access-jcjz4" (OuterVolumeSpecName: "kube-api-access-jcjz4") pod "b0d491ad-ee68-47bb-a1e3-66d22ecca41a" (UID: "b0d491ad-ee68-47bb-a1e3-66d22ecca41a"). InnerVolumeSpecName "kube-api-access-jcjz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.579953 4786 generic.go:334] "Generic (PLEG): container finished" podID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" containerID="be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a" exitCode=0
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.580041 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aa75843b-0c7d-49c1-be09-bef85ec8fd16","Type":"ContainerDied","Data":"be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a"}
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.580077 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aa75843b-0c7d-49c1-be09-bef85ec8fd16","Type":"ContainerDied","Data":"760611395e73b3b83a10b25c81ff8ebd3281524b0deaab10939f310d0fc47f02"}
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.580098 4786 scope.go:117] "RemoveContainer" containerID="be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.580098 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.608127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-scripts" (OuterVolumeSpecName: "scripts") pod "b0d491ad-ee68-47bb-a1e3-66d22ecca41a" (UID: "b0d491ad-ee68-47bb-a1e3-66d22ecca41a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.623614 4786 generic.go:334] "Generic (PLEG): container finished" podID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerID="00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120" exitCode=0
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.623708 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e9745df-949d-443d-93bb-0e5b3692ccd6","Type":"ContainerDied","Data":"00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120"}
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.623736 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e9745df-949d-443d-93bb-0e5b3692ccd6","Type":"ContainerDied","Data":"c855a610ac23727fbb84ed8ff32f53ad32c9347c74559efd48b339b33cf3996b"}
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.623804 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.628417 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcjz4\" (UniqueName: \"kubernetes.io/projected/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-kube-api-access-jcjz4\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.628440 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.628449 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.662574 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.708793 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.738680 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.740423 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerID="52dda96d5af850effe132c05ad903de42b92e2b4d4cba2475c20cf70be8ffde6" exitCode=0 Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.740451 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88c9bb894-zzsvv" event={"ID":"b0d491ad-ee68-47bb-a1e3-66d22ecca41a","Type":"ContainerDied","Data":"52dda96d5af850effe132c05ad903de42b92e2b4d4cba2475c20cf70be8ffde6"} Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.740537 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-88c9bb894-zzsvv" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.764514 4786 scope.go:117] "RemoveContainer" containerID="9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.771278 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.793632 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.793836 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="e4f5bdf5-c352-4722-bcbd-704965ab36f0" containerName="memcached" containerID="cri-o://f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf" gracePeriod=30 Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.808848 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5794-account-create-update-5vccb"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.815923 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d9fb9c86-4lc8x" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.824005 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d9fb9c86-4lc8x" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.830152 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-5794-account-create-update-5vccb"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838335 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5794-account-create-update-dcwbd"] Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.838679 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerName="glance-httpd" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838691 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerName="glance-httpd" Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.838705 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerName="glance-log" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838711 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerName="glance-log" Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.838727 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerName="placement-log" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838733 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerName="placement-log" Mar 13 12:13:21 crc kubenswrapper[4786]: E0313 12:13:21.838747 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerName="placement-api" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838752 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerName="placement-api" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838908 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerName="glance-log" 
Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838931 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerName="placement-api" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838939 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" containerName="placement-log" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.838954 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" containerName="glance-httpd" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.841097 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.857782 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5794-account-create-update-dcwbd"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.866308 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.877332 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4l225"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.923273 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4l225"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.947297 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts\") pod \"keystone-5794-account-create-update-dcwbd\" (UID: \"d0510911-6cff-44a7-be99-81a055f7197a\") " pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.947450 4786 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": read tcp 10.217.0.2:51952->10.217.0.213:8775: read: connection reset by peer" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.947508 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": read tcp 10.217.0.2:51958->10.217.0.213:8775: read: connection reset by peer" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.947551 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p55kt\" (UniqueName: \"kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt\") pod \"keystone-5794-account-create-update-dcwbd\" (UID: \"d0510911-6cff-44a7-be99-81a055f7197a\") " pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.969495 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d7547f9f8-fzqlk"] Mar 13 12:13:21 crc kubenswrapper[4786]: I0313 12:13:21.969718 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-d7547f9f8-fzqlk" podUID="c03ed618-9a09-48b0-84d4-873357872d22" containerName="keystone-api" containerID="cri-o://495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc" gracePeriod=30 Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.013863 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gtn8s"] Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.058448 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b0d491ad-ee68-47bb-a1e3-66d22ecca41a" (UID: "b0d491ad-ee68-47bb-a1e3-66d22ecca41a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.061850 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-config-data" (OuterVolumeSpecName: "config-data") pod "b0d491ad-ee68-47bb-a1e3-66d22ecca41a" (UID: "b0d491ad-ee68-47bb-a1e3-66d22ecca41a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.063562 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts\") pod \"keystone-5794-account-create-update-dcwbd\" (UID: \"d0510911-6cff-44a7-be99-81a055f7197a\") " pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.063614 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p55kt\" (UniqueName: \"kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt\") pod \"keystone-5794-account-create-update-dcwbd\" (UID: \"d0510911-6cff-44a7-be99-81a055f7197a\") " pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.064377 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.064417 4786 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap 
"openstack-scripts" not found Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.064462 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts podName:d0510911-6cff-44a7-be99-81a055f7197a nodeName:}" failed. No retries permitted until 2026-03-13 12:13:22.564443157 +0000 UTC m=+1589.844096604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts") pod "keystone-5794-account-create-update-dcwbd" (UID: "d0510911-6cff-44a7-be99-81a055f7197a") : configmap "openstack-scripts" not found Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.065301 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.073373 4786 projected.go:194] Error preparing data for projected volume kube-api-access-p55kt for pod openstack/keystone-5794-account-create-update-dcwbd: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.073451 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt podName:d0510911-6cff-44a7-be99-81a055f7197a nodeName:}" failed. No retries permitted until 2026-03-13 12:13:22.573433502 +0000 UTC m=+1589.853086949 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p55kt" (UniqueName: "kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt") pod "keystone-5794-account-create-update-dcwbd" (UID: "d0510911-6cff-44a7-be99-81a055f7197a") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.082259 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gtn8s"] Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.120230 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.130762 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0d491ad-ee68-47bb-a1e3-66d22ecca41a" (UID: "b0d491ad-ee68-47bb-a1e3-66d22ecca41a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.139860 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b0d491ad-ee68-47bb-a1e3-66d22ecca41a" (UID: "b0d491ad-ee68-47bb-a1e3-66d22ecca41a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.139950 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5794-account-create-update-dcwbd"] Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.145203 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bsjm4"] Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.148782 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bsjm4"] Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.154849 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wvn8j"] Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.181673 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.181694 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d491ad-ee68-47bb-a1e3-66d22ecca41a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.183355 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.186708 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.197984 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.198032 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="ovn-northd" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.318129 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="258afae9-f870-4f49-8102-3f987302da26" containerName="galera" containerID="cri-o://71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f" gracePeriod=30 Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.392297 4786 scope.go:117] "RemoveContainer" containerID="be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.396048 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a\": container with ID starting with be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a not found: ID does not exist" containerID="be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.396369 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a"} err="failed to get container status \"be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a\": rpc error: code = NotFound desc = could not find container \"be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a\": container with ID starting with be12833c5c5a425811c164a8335668f605b764c235e7e7c211cd29ff59fc851a not found: ID does not exist" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.396393 4786 scope.go:117] "RemoveContainer" containerID="9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.404405 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-p55kt operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-5794-account-create-update-dcwbd" podUID="d0510911-6cff-44a7-be99-81a055f7197a" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.404899 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-88c9bb894-zzsvv"] Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.413494 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410\": container with ID starting with 9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410 not found: ID does not exist" containerID="9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.413532 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410"} err="failed to get container status \"9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410\": rpc error: code = NotFound desc = 
could not find container \"9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410\": container with ID starting with 9a7cef1d0c827ab79ddd4b27b25b98950fcf7e06c21e8512d95adf09bd2f7410 not found: ID does not exist" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.413555 4786 scope.go:117] "RemoveContainer" containerID="00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.415404 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-88c9bb894-zzsvv"] Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.431302 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.460427 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.506451 4786 scope.go:117] "RemoveContainer" containerID="bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.512098 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wvn8j"] Mar 13 12:13:22 crc kubenswrapper[4786]: W0313 12:13:22.546183 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc63ed5a_1b9a_4d6d_bcec_8f7b282642e0.slice/crio-7a248a988b555710968a2b6da3d37f25285eaf08301fda3b751d5b6287eb2ebe WatchSource:0}: Error finding container 7a248a988b555710968a2b6da3d37f25285eaf08301fda3b751d5b6287eb2ebe: Status 404 returned error can't find the container with id 7a248a988b555710968a2b6da3d37f25285eaf08301fda3b751d5b6287eb2ebe Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.571928 4786 scope.go:117] "RemoveContainer" containerID="00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120" Mar 13 
12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.572336 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120\": container with ID starting with 00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120 not found: ID does not exist" containerID="00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.572362 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120"} err="failed to get container status \"00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120\": rpc error: code = NotFound desc = could not find container \"00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120\": container with ID starting with 00aa3c5dcfb4327ae05fc8de1b3da58f47fbdcda25f361aa6690865e5aa93120 not found: ID does not exist" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.572381 4786 scope.go:117] "RemoveContainer" containerID="bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.572729 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3\": container with ID starting with bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3 not found: ID does not exist" containerID="bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.572752 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3"} err="failed to get container status 
\"bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3\": rpc error: code = NotFound desc = could not find container \"bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3\": container with ID starting with bd43fb63c07bb6394d9a942e167a4072b59801bdd761ce8c38070813a9b47be3 not found: ID does not exist" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.572765 4786 scope.go:117] "RemoveContainer" containerID="52dda96d5af850effe132c05ad903de42b92e2b4d4cba2475c20cf70be8ffde6" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.572525 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 12:13:22 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 12:13:22 crc kubenswrapper[4786]: Mar 13 12:13:22 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 12:13:22 crc kubenswrapper[4786]: Mar 13 12:13:22 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 12:13:22 crc kubenswrapper[4786]: Mar 13 12:13:22 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 12:13:22 crc kubenswrapper[4786]: Mar 13 12:13:22 crc kubenswrapper[4786]: if [ -n "" ]; then Mar 13 12:13:22 crc kubenswrapper[4786]: GRANT_DATABASE="" Mar 13 12:13:22 crc kubenswrapper[4786]: else Mar 13 12:13:22 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 12:13:22 crc kubenswrapper[4786]: fi Mar 13 12:13:22 crc kubenswrapper[4786]: Mar 13 12:13:22 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 12:13:22 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 12:13:22 crc kubenswrapper[4786]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 12:13:22 crc kubenswrapper[4786]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 13 12:13:22 crc kubenswrapper[4786]: # support updates Mar 13 12:13:22 crc kubenswrapper[4786]: Mar 13 12:13:22 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.575547 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-wvn8j" podUID="fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592365 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-internal-tls-certs\") pod \"f67ad69d-5191-4d93-9326-b93b0653a82c\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592424 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67ad69d-5191-4d93-9326-b93b0653a82c-logs\") pod \"f67ad69d-5191-4d93-9326-b93b0653a82c\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592474 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8cx7\" (UniqueName: \"kubernetes.io/projected/124c632a-4ff3-419c-9e26-ba68929feeb7-kube-api-access-s8cx7\") pod \"124c632a-4ff3-419c-9e26-ba68929feeb7\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592506 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk4k9\" (UniqueName: 
\"kubernetes.io/projected/f67ad69d-5191-4d93-9326-b93b0653a82c-kube-api-access-zk4k9\") pod \"f67ad69d-5191-4d93-9326-b93b0653a82c\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592525 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-combined-ca-bundle\") pod \"f67ad69d-5191-4d93-9326-b93b0653a82c\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592544 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-internal-tls-certs\") pod \"124c632a-4ff3-419c-9e26-ba68929feeb7\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592582 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/124c632a-4ff3-419c-9e26-ba68929feeb7-logs\") pod \"124c632a-4ff3-419c-9e26-ba68929feeb7\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592602 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-public-tls-certs\") pod \"124c632a-4ff3-419c-9e26-ba68929feeb7\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592657 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data\") pod \"124c632a-4ff3-419c-9e26-ba68929feeb7\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592681 
4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-config-data\") pod \"f67ad69d-5191-4d93-9326-b93b0653a82c\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592699 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-combined-ca-bundle\") pod \"124c632a-4ff3-419c-9e26-ba68929feeb7\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592718 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data-custom\") pod \"124c632a-4ff3-419c-9e26-ba68929feeb7\" (UID: \"124c632a-4ff3-419c-9e26-ba68929feeb7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.592815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-public-tls-certs\") pod \"f67ad69d-5191-4d93-9326-b93b0653a82c\" (UID: \"f67ad69d-5191-4d93-9326-b93b0653a82c\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.593161 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts\") pod \"keystone-5794-account-create-update-dcwbd\" (UID: \"d0510911-6cff-44a7-be99-81a055f7197a\") " pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.593203 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p55kt\" (UniqueName: 
\"kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt\") pod \"keystone-5794-account-create-update-dcwbd\" (UID: \"d0510911-6cff-44a7-be99-81a055f7197a\") " pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.594497 4786 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.594543 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts podName:d0510911-6cff-44a7-be99-81a055f7197a nodeName:}" failed. No retries permitted until 2026-03-13 12:13:23.594529371 +0000 UTC m=+1590.874182818 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts") pod "keystone-5794-account-create-update-dcwbd" (UID: "d0510911-6cff-44a7-be99-81a055f7197a") : configmap "openstack-scripts" not found Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.599282 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f67ad69d-5191-4d93-9326-b93b0653a82c-logs" (OuterVolumeSpecName: "logs") pod "f67ad69d-5191-4d93-9326-b93b0653a82c" (UID: "f67ad69d-5191-4d93-9326-b93b0653a82c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.601520 4786 projected.go:194] Error preparing data for projected volume kube-api-access-p55kt for pod openstack/keystone-5794-account-create-update-dcwbd: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 12:13:22 crc kubenswrapper[4786]: E0313 12:13:22.601576 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt podName:d0510911-6cff-44a7-be99-81a055f7197a nodeName:}" failed. No retries permitted until 2026-03-13 12:13:23.601559732 +0000 UTC m=+1590.881213169 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-p55kt" (UniqueName: "kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt") pod "keystone-5794-account-create-update-dcwbd" (UID: "d0510911-6cff-44a7-be99-81a055f7197a") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.602315 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124c632a-4ff3-419c-9e26-ba68929feeb7-logs" (OuterVolumeSpecName: "logs") pod "124c632a-4ff3-419c-9e26-ba68929feeb7" (UID: "124c632a-4ff3-419c-9e26-ba68929feeb7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.602465 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.602792 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67ad69d-5191-4d93-9326-b93b0653a82c-kube-api-access-zk4k9" (OuterVolumeSpecName: "kube-api-access-zk4k9") pod "f67ad69d-5191-4d93-9326-b93b0653a82c" (UID: "f67ad69d-5191-4d93-9326-b93b0653a82c"). InnerVolumeSpecName "kube-api-access-zk4k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.605603 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124c632a-4ff3-419c-9e26-ba68929feeb7-kube-api-access-s8cx7" (OuterVolumeSpecName: "kube-api-access-s8cx7") pod "124c632a-4ff3-419c-9e26-ba68929feeb7" (UID: "124c632a-4ff3-419c-9e26-ba68929feeb7"). InnerVolumeSpecName "kube-api-access-s8cx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.613222 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "124c632a-4ff3-419c-9e26-ba68929feeb7" (UID: "124c632a-4ff3-419c-9e26-ba68929feeb7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.618183 4786 scope.go:117] "RemoveContainer" containerID="f34dd912eb47d002fd56518d38540c1994f7c17513d2933e712f79bc0fca64c8" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.635026 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-config-data" (OuterVolumeSpecName: "config-data") pod "f67ad69d-5191-4d93-9326-b93b0653a82c" (UID: "f67ad69d-5191-4d93-9326-b93b0653a82c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.659555 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f67ad69d-5191-4d93-9326-b93b0653a82c" (UID: "f67ad69d-5191-4d93-9326-b93b0653a82c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.667944 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "124c632a-4ff3-419c-9e26-ba68929feeb7" (UID: "124c632a-4ff3-419c-9e26-ba68929feeb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.686076 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f67ad69d-5191-4d93-9326-b93b0653a82c" (UID: "f67ad69d-5191-4d93-9326-b93b0653a82c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695509 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695583 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67ad69d-5191-4d93-9326-b93b0653a82c-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695596 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8cx7\" (UniqueName: \"kubernetes.io/projected/124c632a-4ff3-419c-9e26-ba68929feeb7-kube-api-access-s8cx7\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695605 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk4k9\" (UniqueName: \"kubernetes.io/projected/f67ad69d-5191-4d93-9326-b93b0653a82c-kube-api-access-zk4k9\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695614 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695622 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/124c632a-4ff3-419c-9e26-ba68929feeb7-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695657 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695665 4786 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.695673 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.700062 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data" (OuterVolumeSpecName: "config-data") pod "124c632a-4ff3-419c-9e26-ba68929feeb7" (UID: "124c632a-4ff3-419c-9e26-ba68929feeb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.749441 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "124c632a-4ff3-419c-9e26-ba68929feeb7" (UID: "124c632a-4ff3-419c-9e26-ba68929feeb7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.752290 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f67ad69d-5191-4d93-9326-b93b0653a82c" (UID: "f67ad69d-5191-4d93-9326-b93b0653a82c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.770016 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "124c632a-4ff3-419c-9e26-ba68929feeb7" (UID: "124c632a-4ff3-419c-9e26-ba68929feeb7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.786584 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wvn8j" event={"ID":"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0","Type":"ContainerStarted","Data":"7a248a988b555710968a2b6da3d37f25285eaf08301fda3b751d5b6287eb2ebe"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.787849 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.796168 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-logs\") pod \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.796219 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-config-data\") pod \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.796238 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6t64\" (UniqueName: \"kubernetes.io/projected/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-kube-api-access-f6t64\") pod \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\" (UID: 
\"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.796323 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-combined-ca-bundle\") pod \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.796382 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-nova-metadata-tls-certs\") pod \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\" (UID: \"1a06f1e9-ddda-42a5-ab33-88473c56a6c7\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.797175 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67ad69d-5191-4d93-9326-b93b0653a82c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.797196 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.797205 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.797213 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c632a-4ff3-419c-9e26-ba68929feeb7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.797751 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-logs" (OuterVolumeSpecName: "logs") pod "1a06f1e9-ddda-42a5-ab33-88473c56a6c7" (UID: "1a06f1e9-ddda-42a5-ab33-88473c56a6c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.801025 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-kube-api-access-f6t64" (OuterVolumeSpecName: "kube-api-access-f6t64") pod "1a06f1e9-ddda-42a5-ab33-88473c56a6c7" (UID: "1a06f1e9-ddda-42a5-ab33-88473c56a6c7"). InnerVolumeSpecName "kube-api-access-f6t64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.807479 4786 generic.go:334] "Generic (PLEG): container finished" podID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerID="2a0a684c9b4c3a0e217496dbf85c36bb9e3e0ef4e9a768baeea5f590927cf39d" exitCode=0 Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.807591 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ab23f85-03a5-4df3-bfa8-da6f748f44e3","Type":"ContainerDied","Data":"2a0a684c9b4c3a0e217496dbf85c36bb9e3e0ef4e9a768baeea5f590927cf39d"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.834188 4786 generic.go:334] "Generic (PLEG): container finished" podID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerID="0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77" exitCode=0 Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.834256 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1a06f1e9-ddda-42a5-ab33-88473c56a6c7","Type":"ContainerDied","Data":"0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.834304 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"1a06f1e9-ddda-42a5-ab33-88473c56a6c7","Type":"ContainerDied","Data":"e6184906d6200c8db201c66032f8cbd84a03cf95163401dd92418357eaba8f81"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.834332 4786 scope.go:117] "RemoveContainer" containerID="0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.834246 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.842901 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-config-data" (OuterVolumeSpecName: "config-data") pod "1a06f1e9-ddda-42a5-ab33-88473c56a6c7" (UID: "1a06f1e9-ddda-42a5-ab33-88473c56a6c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.862918 4786 generic.go:334] "Generic (PLEG): container finished" podID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerID="6ca4ed1353fe2122e66b7cdc238326a51066ac9f00f84fe43e52d17f553e850a" exitCode=2 Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.862951 4786 generic.go:334] "Generic (PLEG): container finished" podID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerID="1b0b367e7cd0a1267707201fcc6eb17e95461077f6d3e9b86822b55b231ea0c0" exitCode=0 Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.863028 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerDied","Data":"6ca4ed1353fe2122e66b7cdc238326a51066ac9f00f84fe43e52d17f553e850a"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.863062 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerDied","Data":"1b0b367e7cd0a1267707201fcc6eb17e95461077f6d3e9b86822b55b231ea0c0"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.866539 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67ad69d-5191-4d93-9326-b93b0653a82c","Type":"ContainerDied","Data":"bf5334205b37b6746d50b1f253519d2ca194eba19b9241b1a6d5f55131cafa2d"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.866649 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.895127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a06f1e9-ddda-42a5-ab33-88473c56a6c7" (UID: "1a06f1e9-ddda-42a5-ab33-88473c56a6c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.899628 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-combined-ca-bundle\") pod \"39720781-e027-4319-9c8f-1d9134d269f8\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.899759 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-certs\") pod \"39720781-e027-4319-9c8f-1d9134d269f8\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.899793 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-config\") pod \"39720781-e027-4319-9c8f-1d9134d269f8\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.899833 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7d6q\" (UniqueName: \"kubernetes.io/projected/39720781-e027-4319-9c8f-1d9134d269f8-kube-api-access-g7d6q\") pod \"39720781-e027-4319-9c8f-1d9134d269f8\" (UID: \"39720781-e027-4319-9c8f-1d9134d269f8\") " Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.900661 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.900689 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.900700 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.900747 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6t64\" (UniqueName: \"kubernetes.io/projected/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-kube-api-access-f6t64\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.918427 4786 generic.go:334] "Generic (PLEG): container finished" podID="39720781-e027-4319-9c8f-1d9134d269f8" containerID="e54841fed760ec7f6d2745e7319245bad6d4e8f266b28143dbe25cdfa3e60e17" exitCode=2 Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.918788 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39720781-e027-4319-9c8f-1d9134d269f8","Type":"ContainerDied","Data":"e54841fed760ec7f6d2745e7319245bad6d4e8f266b28143dbe25cdfa3e60e17"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.919001 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.929228 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "39720781-e027-4319-9c8f-1d9134d269f8" (UID: "39720781-e027-4319-9c8f-1d9134d269f8"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.935395 4786 generic.go:334] "Generic (PLEG): container finished" podID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerID="2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689" exitCode=0 Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.935459 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.935548 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d9fb9c86-4lc8x" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.935603 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fb9c86-4lc8x" event={"ID":"124c632a-4ff3-419c-9e26-ba68929feeb7","Type":"ContainerDied","Data":"2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.935628 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fb9c86-4lc8x" event={"ID":"124c632a-4ff3-419c-9e26-ba68929feeb7","Type":"ContainerDied","Data":"b7f507a6996927896ac0fac8ac658c01d7c23f1bf0a3cfe59c5613fae465b2d3"} Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.953599 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1a06f1e9-ddda-42a5-ab33-88473c56a6c7" (UID: "1a06f1e9-ddda-42a5-ab33-88473c56a6c7"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.967453 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39720781-e027-4319-9c8f-1d9134d269f8" (UID: "39720781-e027-4319-9c8f-1d9134d269f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.975846 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39720781-e027-4319-9c8f-1d9134d269f8-kube-api-access-g7d6q" (OuterVolumeSpecName: "kube-api-access-g7d6q") pod "39720781-e027-4319-9c8f-1d9134d269f8" (UID: "39720781-e027-4319-9c8f-1d9134d269f8"). InnerVolumeSpecName "kube-api-access-g7d6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:22 crc kubenswrapper[4786]: I0313 12:13:22.998701 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "39720781-e027-4319-9c8f-1d9134d269f8" (UID: "39720781-e027-4319-9c8f-1d9134d269f8"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.012770 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.012933 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a06f1e9-ddda-42a5-ab33-88473c56a6c7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.012947 4786 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.013082 4786 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39720781-e027-4319-9c8f-1d9134d269f8-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.013098 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7d6q\" (UniqueName: \"kubernetes.io/projected/39720781-e027-4319-9c8f-1d9134d269f8-kube-api-access-g7d6q\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.091988 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4f5bdf5_c352_4722_bcbd_704965ab36f0.slice/crio-conmon-f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf.scope\": RecentStats: unable to find data in memory cache]" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.205719 4786 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.206310 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.206813 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.206845 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.211338 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.213083 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.217032 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.217092 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.249952 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9d9fb9c86-4lc8x"] Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.255763 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.256171 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.262534 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-9d9fb9c86-4lc8x"] Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.281810 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.310855 4786 scope.go:117] "RemoveContainer" containerID="8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.327416 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.335955 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.340430 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.345617 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.366208 4786 scope.go:117] "RemoveContainer" containerID="0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.366551 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.366593 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" containerName="nova-scheduler-scheduler" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.370577 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77\": container with ID starting with 0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77 not found: ID does not exist" containerID="0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.370633 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77"} err="failed to get container status \"0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77\": rpc error: code = NotFound desc = could not find container \"0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77\": container with ID starting with 0544f2af6511ff840c9d62209dec7bce3e77b105b67902967cd2163686497f77 not found: ID does not exist" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.370662 4786 scope.go:117] "RemoveContainer" containerID="8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.371346 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2\": container with ID starting with 8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2 not found: ID does not exist" containerID="8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.371405 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2"} err="failed to get container status \"8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2\": rpc error: code = NotFound desc = could not find container \"8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2\": container with ID starting with 8c4446bf4f9de40cbfcd186e433cc3663d8a2010eb918a6d2f86e474455f9ee2 not found: ID does not exist" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.371431 4786 scope.go:117] "RemoveContainer" containerID="8d638dd580ac1c508eca3ff370e5d3dd8062fb913eb5b6a7194fb27153cf2701" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.373589 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.376286 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wvn8j" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.396452 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.404034 4786 scope.go:117] "RemoveContainer" containerID="bde1fb753ad4d23d69552efc3946e3c4ff275d991136e6e6fc724f42a4350c75" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.408952 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.415045 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.416387 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.417616 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.417648 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="b488d3ce-635a-4279-a05e-fba3b6599bda" containerName="nova-cell1-conductor-conductor" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.430723 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-httpd-run\") pod \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.430761 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.430870 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-config-data\") pod \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.430927 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-combined-ca-bundle\") pod \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.430953 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s749p\" (UniqueName: \"kubernetes.io/projected/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-kube-api-access-s749p\") pod \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.430995 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-logs\") pod \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.431020 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-public-tls-certs\") pod \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.431040 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-scripts\") pod \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\" (UID: \"4ab23f85-03a5-4df3-bfa8-da6f748f44e3\") " Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.431436 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.431488 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data podName:53fea24b-7ca8-4c0a-96d1-458ca1e877a7 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:31.43147383 +0000 UTC m=+1598.711127277 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data") pod "rabbitmq-server-0" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7") : configmap "rabbitmq-config-data" not found Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.438480 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4ab23f85-03a5-4df3-bfa8-da6f748f44e3" (UID: "4ab23f85-03a5-4df3-bfa8-da6f748f44e3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.444323 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-logs" (OuterVolumeSpecName: "logs") pod "4ab23f85-03a5-4df3-bfa8-da6f748f44e3" (UID: "4ab23f85-03a5-4df3-bfa8-da6f748f44e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.444487 4786 scope.go:117] "RemoveContainer" containerID="e54841fed760ec7f6d2745e7319245bad6d4e8f266b28143dbe25cdfa3e60e17" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.447022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-kube-api-access-s749p" (OuterVolumeSpecName: "kube-api-access-s749p") pod "4ab23f85-03a5-4df3-bfa8-da6f748f44e3" (UID: "4ab23f85-03a5-4df3-bfa8-da6f748f44e3"). InnerVolumeSpecName "kube-api-access-s749p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.447276 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "4ab23f85-03a5-4df3-bfa8-da6f748f44e3" (UID: "4ab23f85-03a5-4df3-bfa8-da6f748f44e3"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.452522 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-scripts" (OuterVolumeSpecName: "scripts") pod "4ab23f85-03a5-4df3-bfa8-da6f748f44e3" (UID: "4ab23f85-03a5-4df3-bfa8-da6f748f44e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.455284 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" path="/var/lib/kubelet/pods/124c632a-4ff3-419c-9e26-ba68929feeb7/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.456106 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" path="/var/lib/kubelet/pods/1a06f1e9-ddda-42a5-ab33-88473c56a6c7/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.456630 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39720781-e027-4319-9c8f-1d9134d269f8" path="/var/lib/kubelet/pods/39720781-e027-4319-9c8f-1d9134d269f8/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.457694 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e9745df-949d-443d-93bb-0e5b3692ccd6" path="/var/lib/kubelet/pods/3e9745df-949d-443d-93bb-0e5b3692ccd6/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.458351 4786 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="40623686-b681-4c6b-aa73-5b5ac94e4a4c" path="/var/lib/kubelet/pods/40623686-b681-4c6b-aa73-5b5ac94e4a4c/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.458904 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87da50ad-41f1-4208-b36c-d874ac2250c7" path="/var/lib/kubelet/pods/87da50ad-41f1-4208-b36c-d874ac2250c7/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.460035 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa75843b-0c7d-49c1-be09-bef85ec8fd16" path="/var/lib/kubelet/pods/aa75843b-0c7d-49c1-be09-bef85ec8fd16/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.460932 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d491ad-ee68-47bb-a1e3-66d22ecca41a" path="/var/lib/kubelet/pods/b0d491ad-ee68-47bb-a1e3-66d22ecca41a/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.461459 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce773763-3741-4253-87c8-9726920b41dc" path="/var/lib/kubelet/pods/ce773763-3741-4253-87c8-9726920b41dc/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.464578 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" path="/var/lib/kubelet/pods/f67ad69d-5191-4d93-9326-b93b0653a82c/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.465351 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffbc9776-7662-41d9-93ac-38c5b98709ab" path="/var/lib/kubelet/pods/ffbc9776-7662-41d9-93ac-38c5b98709ab/volumes" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.479453 4786 scope.go:117] "RemoveContainer" containerID="2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.488775 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ab23f85-03a5-4df3-bfa8-da6f748f44e3" (UID: "4ab23f85-03a5-4df3-bfa8-da6f748f44e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.502239 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4ab23f85-03a5-4df3-bfa8-da6f748f44e3" (UID: "4ab23f85-03a5-4df3-bfa8-da6f748f44e3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.514294 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-config-data" (OuterVolumeSpecName: "config-data") pod "4ab23f85-03a5-4df3-bfa8-da6f748f44e3" (UID: "4ab23f85-03a5-4df3-bfa8-da6f748f44e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.532228 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-operator-scripts\") pod \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\" (UID: \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.532330 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w6w5\" (UniqueName: \"kubernetes.io/projected/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-kube-api-access-8w6w5\") pod \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\" (UID: \"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.532945 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.532969 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s749p\" (UniqueName: \"kubernetes.io/projected/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-kube-api-access-s749p\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.532984 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.532995 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.533007 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.533027 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.533050 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.533235 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab23f85-03a5-4df3-bfa8-da6f748f44e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.534255 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0" (UID: "fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.542093 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-kube-api-access-8w6w5" (OuterVolumeSpecName: "kube-api-access-8w6w5") pod "fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0" (UID: "fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0"). InnerVolumeSpecName "kube-api-access-8w6w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.551339 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.571775 4786 scope.go:117] "RemoveContainer" containerID="09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.611265 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.637261 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts\") pod \"keystone-5794-account-create-update-dcwbd\" (UID: \"d0510911-6cff-44a7-be99-81a055f7197a\") " pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.637324 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p55kt\" (UniqueName: \"kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt\") pod \"keystone-5794-account-create-update-dcwbd\" (UID: \"d0510911-6cff-44a7-be99-81a055f7197a\") " pod="openstack/keystone-5794-account-create-update-dcwbd" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.637397 4786 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.637464 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts podName:d0510911-6cff-44a7-be99-81a055f7197a nodeName:}" failed. 
No retries permitted until 2026-03-13 12:13:25.637444979 +0000 UTC m=+1592.917098506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts") pod "keystone-5794-account-create-update-dcwbd" (UID: "d0510911-6cff-44a7-be99-81a055f7197a") : configmap "openstack-scripts" not found Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.637497 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.638093 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.638123 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w6w5\" (UniqueName: \"kubernetes.io/projected/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0-kube-api-access-8w6w5\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.639834 4786 projected.go:194] Error preparing data for projected volume kube-api-access-p55kt for pod openstack/keystone-5794-account-create-update-dcwbd: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.639913 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt podName:d0510911-6cff-44a7-be99-81a055f7197a nodeName:}" failed. No retries permitted until 2026-03-13 12:13:25.639897066 +0000 UTC m=+1592.919550593 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p55kt" (UniqueName: "kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt") pod "keystone-5794-account-create-update-dcwbd" (UID: "d0510911-6cff-44a7-be99-81a055f7197a") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.656495 4786 scope.go:117] "RemoveContainer" containerID="2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.658269 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689\": container with ID starting with 2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689 not found: ID does not exist" containerID="2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.658308 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689"} err="failed to get container status \"2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689\": rpc error: code = NotFound desc = could not find container \"2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689\": container with ID starting with 2758d47ec08ae3020c1eeb9f4accf777d21b374ac3409310e1b2573e3f9b8689 not found: ID does not exist" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.658334 4786 scope.go:117] "RemoveContainer" containerID="09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b" Mar 13 12:13:23 crc kubenswrapper[4786]: E0313 12:13:23.662629 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b\": container with ID starting with 09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b not found: ID does not exist" containerID="09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.662672 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b"} err="failed to get container status \"09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b\": rpc error: code = NotFound desc = could not find container \"09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b\": container with ID starting with 09e33b2c6a674d69e8c0d17fe762bf4f1cdb66c17266e9d1a206d085bd08396b not found: ID does not exist" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.668636 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.745280 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-memcached-tls-certs\") pod \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.745361 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-combined-ca-bundle\") pod \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") " Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.745412 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-config-data\") pod \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.745459 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kolla-config\") pod \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.745504 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbdbz\" (UniqueName: \"kubernetes.io/projected/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kube-api-access-hbdbz\") pod \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\" (UID: \"e4f5bdf5-c352-4722-bcbd-704965ab36f0\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.749151 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kube-api-access-hbdbz" (OuterVolumeSpecName: "kube-api-access-hbdbz") pod "e4f5bdf5-c352-4722-bcbd-704965ab36f0" (UID: "e4f5bdf5-c352-4722-bcbd-704965ab36f0"). InnerVolumeSpecName "kube-api-access-hbdbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.749380 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e4f5bdf5-c352-4722-bcbd-704965ab36f0" (UID: "e4f5bdf5-c352-4722-bcbd-704965ab36f0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.749398 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-config-data" (OuterVolumeSpecName: "config-data") pod "e4f5bdf5-c352-4722-bcbd-704965ab36f0" (UID: "e4f5bdf5-c352-4722-bcbd-704965ab36f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.779517 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4f5bdf5-c352-4722-bcbd-704965ab36f0" (UID: "e4f5bdf5-c352-4722-bcbd-704965ab36f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.792694 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "e4f5bdf5-c352-4722-bcbd-704965ab36f0" (UID: "e4f5bdf5-c352-4722-bcbd-704965ab36f0"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.844725 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-666xn" podUID="b03b506e-7150-4904-b58b-8e442885af50" containerName="ovn-controller" probeResult="failure" output="command timed out"
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.848841 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-confd\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.848901 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnmkb\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-kube-api-access-mnmkb\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.848928 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-pod-info\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849001 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-config-data\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849060 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-server-conf\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-plugins-conf\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849123 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-erlang-cookie-secret\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849156 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-plugins\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849194 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-erlang-cookie\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849330 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-tls\") pod \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\" (UID: \"3b196d91-2a1f-4ee5-81d5-0133f2917cc5\") "
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849617 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849794 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbdbz\" (UniqueName: \"kubernetes.io/projected/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kube-api-access-hbdbz\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849820 4786 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-memcached-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849833 4786 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849844 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f5bdf5-c352-4722-bcbd-704965ab36f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849855 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.849868 4786 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4f5bdf5-c352-4722-bcbd-704965ab36f0-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.850083 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.851861 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.853103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-pod-info" (OuterVolumeSpecName: "pod-info") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.854580 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.855157 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.858199 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.876516 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-kube-api-access-mnmkb" (OuterVolumeSpecName: "kube-api-access-mnmkb") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "kube-api-access-mnmkb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.886167 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-666xn" podUID="b03b506e-7150-4904-b58b-8e442885af50" containerName="ovn-controller" probeResult="failure" output=<
Mar 13 12:13:23 crc kubenswrapper[4786]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0
Mar 13 12:13:23 crc kubenswrapper[4786]: >
Mar 13 12:13:23 crc kubenswrapper[4786]: I0313 12:13:23.887949 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-config-data" (OuterVolumeSpecName: "config-data") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.929328 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-server-conf" (OuterVolumeSpecName: "server-conf") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.941704 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3b196d91-2a1f-4ee5-81d5-0133f2917cc5" (UID: "3b196d91-2a1f-4ee5-81d5-0133f2917cc5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951841 4786 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-server-conf\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951871 4786 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951903 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951918 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951943 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951954 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951966 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951977 4786 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-pod-info\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951988 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnmkb\" (UniqueName: \"kubernetes.io/projected/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-kube-api-access-mnmkb\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.951998 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b196d91-2a1f-4ee5-81d5-0133f2917cc5-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.959320 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wvn8j"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.959310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wvn8j" event={"ID":"fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0","Type":"ContainerDied","Data":"7a248a988b555710968a2b6da3d37f25285eaf08301fda3b751d5b6287eb2ebe"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.974747 4786 generic.go:334] "Generic (PLEG): container finished" podID="e4f5bdf5-c352-4722-bcbd-704965ab36f0" containerID="f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf" exitCode=0
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.974779 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.974794 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e4f5bdf5-c352-4722-bcbd-704965ab36f0","Type":"ContainerDied","Data":"f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.975327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e4f5bdf5-c352-4722-bcbd-704965ab36f0","Type":"ContainerDied","Data":"2ca966fc1d68b16911c94352fe111f31c45089d9f16ba2c5cec8e8282997d80e"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.975345 4786 scope.go:117] "RemoveContainer" containerID="f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.976613 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.995706 4786 generic.go:334] "Generic (PLEG): container finished" podID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerID="8546d15d615043030d104f666fcccae710b91eaabc4b545097a038170b3a7dcf" exitCode=0
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.995765 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53fea24b-7ca8-4c0a-96d1-458ca1e877a7","Type":"ContainerDied","Data":"8546d15d615043030d104f666fcccae710b91eaabc4b545097a038170b3a7dcf"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.995790 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53fea24b-7ca8-4c0a-96d1-458ca1e877a7","Type":"ContainerDied","Data":"ab3c6791a13213692e74230d9319ea4b6b280cd4c58b66916b1745cd9aa92039"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.995828 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab3c6791a13213692e74230d9319ea4b6b280cd4c58b66916b1745cd9aa92039"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.998599 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ab23f85-03a5-4df3-bfa8-da6f748f44e3","Type":"ContainerDied","Data":"db587533ffe68bfd326df85ca0eeb44b7da6d8e17ff5b9a8fb4b15d34c97a2bf"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:23.998708 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.002516 4786 generic.go:334] "Generic (PLEG): container finished" podID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerID="19ed6f38037a55c43058db0a67693dffe38372d306c408426bb30752659582c5" exitCode=0
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.002599 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerDied","Data":"19ed6f38037a55c43058db0a67693dffe38372d306c408426bb30752659582c5"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.004602 4786 generic.go:334] "Generic (PLEG): container finished" podID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerID="6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943" exitCode=0
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.004696 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5794-account-create-update-dcwbd"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.004706 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b196d91-2a1f-4ee5-81d5-0133f2917cc5","Type":"ContainerDied","Data":"6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.004728 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b196d91-2a1f-4ee5-81d5-0133f2917cc5","Type":"ContainerDied","Data":"cfcd50ddcccf4394896df1c362f592c8dd9d7067ac5cac9cf40148279326b5db"}
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.004630 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.054056 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.108494 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.118455 4786 scope.go:117] "RemoveContainer" containerID="f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf"
Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.118814 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf\": container with ID starting with f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf not found: ID does not exist" containerID="f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.119085 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf"} err="failed to get container status \"f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf\": rpc error: code = NotFound desc = could not find container \"f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf\": container with ID starting with f003e8af7fef7171e4f5a5b16c8479ad608fc75ee98eaae17c9b28b4df66adbf not found: ID does not exist"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.119118 4786 scope.go:117] "RemoveContainer" containerID="2a0a684c9b4c3a0e217496dbf85c36bb9e3e0ef4e9a768baeea5f590927cf39d"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.161522 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.174938 4786 scope.go:117] "RemoveContainer" containerID="b9aec14b391a1bbbd8f466a3df6625873e9ec6de58fe63728da3a16855652999"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.175671 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.197086 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.223652 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.231014 4786 scope.go:117] "RemoveContainer" containerID="6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.257210 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5794-account-create-update-dcwbd"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.268950 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5794-account-create-update-dcwbd"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270500 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drrl5\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-kube-api-access-drrl5\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270562 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-plugins-conf\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270584 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-confd\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270599 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-tls\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270618 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270645 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270667 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-server-conf\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270694 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-pod-info\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270725 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-erlang-cookie-secret\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270744 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-erlang-cookie\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270767 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-plugins\") pod \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\" (UID: \"53fea24b-7ca8-4c0a-96d1-458ca1e877a7\") "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.270971 4786 scope.go:117] "RemoveContainer" containerID="aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d"
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.271032 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p55kt\" (UniqueName: \"kubernetes.io/projected/d0510911-6cff-44a7-be99-81a055f7197a-kube-api-access-p55kt\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.271046 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0510911-6cff-44a7-be99-81a055f7197a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.271122 4786 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found
Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.271172 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:32.271159164 +0000 UTC m=+1599.550812611 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-config-data" not found
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.274717 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.276484 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.278815 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.279593 4786 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found
Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.281649 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:32.281616388 +0000 UTC m=+1599.561269835 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scripts" not found
Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.279641 4786 secret.go:188] Couldn't get secret openstack/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" not found
Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.282126 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom podName:c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441 nodeName:}" failed. No retries permitted until 2026-03-13 12:13:32.282116961 +0000 UTC m=+1599.561770408 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom") pod "cinder-scheduler-0" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441") : secret "cinder-scheduler-config-data" not found
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.287212 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-kube-api-access-drrl5" (OuterVolumeSpecName: "kube-api-access-drrl5") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "kube-api-access-drrl5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.295099 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-pod-info" (OuterVolumeSpecName: "pod-info") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.296451 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wvn8j"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.306847 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wvn8j"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.309181 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.309698 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.310182 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.312448 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.323501 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.372482 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drrl5\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-kube-api-access-drrl5\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.372826 4786 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.372840 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.372928 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.372942 4786 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-pod-info\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.372951 4786 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.372962 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName:
\"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.372973 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.377866 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-server-conf" (OuterVolumeSpecName: "server-conf") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.396436 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.403081 4786 scope.go:117] "RemoveContainer" containerID="6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943" Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.408528 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943\": container with ID starting with 6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943 not found: ID does not exist" containerID="6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.408592 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943"} err="failed to get container status 
\"6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943\": rpc error: code = NotFound desc = could not find container \"6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943\": container with ID starting with 6ccf1a18147938cb10ed2d43099049107bf12b3a2a26163c224b3fc03ac27943 not found: ID does not exist" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.408620 4786 scope.go:117] "RemoveContainer" containerID="aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d" Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.411094 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d\": container with ID starting with aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d not found: ID does not exist" containerID="aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.411274 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d"} err="failed to get container status \"aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d\": rpc error: code = NotFound desc = could not find container \"aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d\": container with ID starting with aad32da27eeca2832e556a7ad9088a3f42e21292223533013464ec3d0197ff6d not found: ID does not exist" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.420350 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data" (OuterVolumeSpecName: "config-data") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.425825 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b25a4cb-7b76-4863-9085-67f99d81f569/ovn-northd/0.log" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.425916 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.456654 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "53fea24b-7ca8-4c0a-96d1-458ca1e877a7" (UID: "53fea24b-7ca8-4c0a-96d1-458ca1e877a7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.474971 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.475009 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.475022 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.475032 4786 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53fea24b-7ca8-4c0a-96d1-458ca1e877a7-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.578715 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjhfj\" (UniqueName: \"kubernetes.io/projected/2b25a4cb-7b76-4863-9085-67f99d81f569-kube-api-access-bjhfj\") pod \"2b25a4cb-7b76-4863-9085-67f99d81f569\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.578763 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-metrics-certs-tls-certs\") pod \"2b25a4cb-7b76-4863-9085-67f99d81f569\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.578836 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-northd-tls-certs\") pod \"2b25a4cb-7b76-4863-9085-67f99d81f569\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.578903 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-combined-ca-bundle\") pod \"2b25a4cb-7b76-4863-9085-67f99d81f569\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.578932 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-rundir\") pod \"2b25a4cb-7b76-4863-9085-67f99d81f569\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.578960 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-config\") pod 
\"2b25a4cb-7b76-4863-9085-67f99d81f569\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.579097 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-scripts\") pod \"2b25a4cb-7b76-4863-9085-67f99d81f569\" (UID: \"2b25a4cb-7b76-4863-9085-67f99d81f569\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.581511 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-scripts" (OuterVolumeSpecName: "scripts") pod "2b25a4cb-7b76-4863-9085-67f99d81f569" (UID: "2b25a4cb-7b76-4863-9085-67f99d81f569"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.582172 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "2b25a4cb-7b76-4863-9085-67f99d81f569" (UID: "2b25a4cb-7b76-4863-9085-67f99d81f569"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.582776 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-config" (OuterVolumeSpecName: "config") pod "2b25a4cb-7b76-4863-9085-67f99d81f569" (UID: "2b25a4cb-7b76-4863-9085-67f99d81f569"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.597029 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b25a4cb-7b76-4863-9085-67f99d81f569-kube-api-access-bjhfj" (OuterVolumeSpecName: "kube-api-access-bjhfj") pod "2b25a4cb-7b76-4863-9085-67f99d81f569" (UID: "2b25a4cb-7b76-4863-9085-67f99d81f569"). InnerVolumeSpecName "kube-api-access-bjhfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.629126 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b25a4cb-7b76-4863-9085-67f99d81f569" (UID: "2b25a4cb-7b76-4863-9085-67f99d81f569"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.655372 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2b25a4cb-7b76-4863-9085-67f99d81f569" (UID: "2b25a4cb-7b76-4863-9085-67f99d81f569"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.664968 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "2b25a4cb-7b76-4863-9085-67f99d81f569" (UID: "2b25a4cb-7b76-4863-9085-67f99d81f569"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.681423 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.681466 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b25a4cb-7b76-4863-9085-67f99d81f569-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.681481 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjhfj\" (UniqueName: \"kubernetes.io/projected/2b25a4cb-7b76-4863-9085-67f99d81f569-kube-api-access-bjhfj\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.681494 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.681507 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.681519 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b25a4cb-7b76-4863-9085-67f99d81f569-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.681531 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b25a4cb-7b76-4863-9085-67f99d81f569-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.760617 
4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.782282 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-combined-ca-bundle\") pod \"b488d3ce-635a-4279-a05e-fba3b6599bda\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.782515 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-config-data\") pod \"b488d3ce-635a-4279-a05e-fba3b6599bda\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.782590 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcj2b\" (UniqueName: \"kubernetes.io/projected/b488d3ce-635a-4279-a05e-fba3b6599bda-kube-api-access-bcj2b\") pod \"b488d3ce-635a-4279-a05e-fba3b6599bda\" (UID: \"b488d3ce-635a-4279-a05e-fba3b6599bda\") " Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.787526 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b488d3ce-635a-4279-a05e-fba3b6599bda-kube-api-access-bcj2b" (OuterVolumeSpecName: "kube-api-access-bcj2b") pod "b488d3ce-635a-4279-a05e-fba3b6599bda" (UID: "b488d3ce-635a-4279-a05e-fba3b6599bda"). InnerVolumeSpecName "kube-api-access-bcj2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.807972 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b488d3ce-635a-4279-a05e-fba3b6599bda" (UID: "b488d3ce-635a-4279-a05e-fba3b6599bda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.809026 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-config-data" (OuterVolumeSpecName: "config-data") pod "b488d3ce-635a-4279-a05e-fba3b6599bda" (UID: "b488d3ce-635a-4279-a05e-fba3b6599bda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.884757 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.884791 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcj2b\" (UniqueName: \"kubernetes.io/projected/b488d3ce-635a-4279-a05e-fba3b6599bda-kube-api-access-bcj2b\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.884801 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b488d3ce-635a-4279-a05e-fba3b6599bda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.922694 4786 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 13 12:13:24 crc kubenswrapper[4786]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 
2026-03-13T12:13:17Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 13 12:13:24 crc kubenswrapper[4786]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 13 12:13:24 crc kubenswrapper[4786]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-666xn" message=< Mar 13 12:13:24 crc kubenswrapper[4786]: Exiting ovn-controller (1) [FAILED] Mar 13 12:13:24 crc kubenswrapper[4786]: Killing ovn-controller (1) [ OK ] Mar 13 12:13:24 crc kubenswrapper[4786]: Killing ovn-controller (1) with SIGKILL [ OK ] Mar 13 12:13:24 crc kubenswrapper[4786]: 2026-03-13T12:13:17Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 13 12:13:24 crc kubenswrapper[4786]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 13 12:13:24 crc kubenswrapper[4786]: > Mar 13 12:13:24 crc kubenswrapper[4786]: E0313 12:13:24.922750 4786 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 13 12:13:24 crc kubenswrapper[4786]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-13T12:13:17Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 13 12:13:24 crc kubenswrapper[4786]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 13 12:13:24 crc kubenswrapper[4786]: > pod="openstack/ovn-controller-666xn" podUID="b03b506e-7150-4904-b58b-8e442885af50" containerName="ovn-controller" containerID="cri-o://15a58925a001b150b8aba5de1a05d26b0e8b136642a71e1d37e08618e19f5026" Mar 13 12:13:24 crc kubenswrapper[4786]: I0313 12:13:24.922805 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-666xn" podUID="b03b506e-7150-4904-b58b-8e442885af50" containerName="ovn-controller" containerID="cri-o://15a58925a001b150b8aba5de1a05d26b0e8b136642a71e1d37e08618e19f5026" gracePeriod=21 Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.020273 4786 generic.go:334] "Generic (PLEG): 
container finished" podID="b488d3ce-635a-4279-a05e-fba3b6599bda" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" exitCode=0 Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.020343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b488d3ce-635a-4279-a05e-fba3b6599bda","Type":"ContainerDied","Data":"4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c"} Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.020375 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b488d3ce-635a-4279-a05e-fba3b6599bda","Type":"ContainerDied","Data":"cde4f39690d9f0257866837da4dfee74e2cf3f24aec6115a79c278edf3ba33bc"} Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.020391 4786 scope.go:117] "RemoveContainer" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.020492 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.028764 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-666xn_b03b506e-7150-4904-b58b-8e442885af50/ovn-controller/0.log" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.028797 4786 generic.go:334] "Generic (PLEG): container finished" podID="b03b506e-7150-4904-b58b-8e442885af50" containerID="15a58925a001b150b8aba5de1a05d26b0e8b136642a71e1d37e08618e19f5026" exitCode=137 Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.028838 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn" event={"ID":"b03b506e-7150-4904-b58b-8e442885af50","Type":"ContainerDied","Data":"15a58925a001b150b8aba5de1a05d26b0e8b136642a71e1d37e08618e19f5026"} Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.030687 4786 generic.go:334] "Generic (PLEG): container finished" podID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerID="72ec4a750cf5f8f8444ba20cf0c7ee683c4b7001b32595a47908763487ea5853" exitCode=0 Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.030733 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441","Type":"ContainerDied","Data":"72ec4a750cf5f8f8444ba20cf0c7ee683c4b7001b32595a47908763487ea5853"} Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.036675 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b25a4cb-7b76-4863-9085-67f99d81f569/ovn-northd/0.log" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.036836 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerID="73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244" exitCode=139 Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.036941 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.037055 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b25a4cb-7b76-4863-9085-67f99d81f569","Type":"ContainerDied","Data":"73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244"} Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.037136 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b25a4cb-7b76-4863-9085-67f99d81f569","Type":"ContainerDied","Data":"c4b3914edbaf72edb7abfa866ecf88fb7934c0dc501c7ee79e9e90bf44838b9f"} Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.051777 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.350346 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.350541 4786 scope.go:117] "RemoveContainer" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.356053 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.363020 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:13:25 crc kubenswrapper[4786]: E0313 12:13:25.363604 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c\": container with ID starting with 4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c not found: ID does not exist" containerID="4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.363634 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c"} err="failed to get container status \"4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c\": rpc error: code = NotFound desc = could not find container \"4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c\": container with ID starting with 4f63c17f45f498ce41376edff72efaafba84bc7240db02ddc207bbb409c7400c not found: ID does not exist" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.363656 4786 scope.go:117] "RemoveContainer" containerID="9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.367984 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.384097 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.390257 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-666xn_b03b506e-7150-4904-b58b-8e442885af50/ovn-controller/0.log" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.390364 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666xn" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.398377 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.424481 4786 scope.go:117] "RemoveContainer" containerID="73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.452405 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" path="/var/lib/kubelet/pods/2b25a4cb-7b76-4863-9085-67f99d81f569/volumes" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.453284 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" path="/var/lib/kubelet/pods/3b196d91-2a1f-4ee5-81d5-0133f2917cc5/volumes" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.454470 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" path="/var/lib/kubelet/pods/4ab23f85-03a5-4df3-bfa8-da6f748f44e3/volumes" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.455318 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" path="/var/lib/kubelet/pods/53fea24b-7ca8-4c0a-96d1-458ca1e877a7/volumes" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.455895 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b488d3ce-635a-4279-a05e-fba3b6599bda" path="/var/lib/kubelet/pods/b488d3ce-635a-4279-a05e-fba3b6599bda/volumes" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.456712 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0510911-6cff-44a7-be99-81a055f7197a" path="/var/lib/kubelet/pods/d0510911-6cff-44a7-be99-81a055f7197a/volumes" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.457067 4786 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="e4f5bdf5-c352-4722-bcbd-704965ab36f0" path="/var/lib/kubelet/pods/e4f5bdf5-c352-4722-bcbd-704965ab36f0/volumes" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.457494 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0" path="/var/lib/kubelet/pods/fc63ed5a-1b9a-4d6d-bcec-8f7b282642e0/volumes" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.465547 4786 scope.go:117] "RemoveContainer" containerID="9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc" Mar 13 12:13:25 crc kubenswrapper[4786]: E0313 12:13:25.466003 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc\": container with ID starting with 9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc not found: ID does not exist" containerID="9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.466028 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc"} err="failed to get container status \"9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc\": rpc error: code = NotFound desc = could not find container \"9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc\": container with ID starting with 9e22fa17aac9beb4504d374534f103c8784bbaa9207d8a0580a6f761e87cd6dc not found: ID does not exist" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.466045 4786 scope.go:117] "RemoveContainer" containerID="73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244" Mar 13 12:13:25 crc kubenswrapper[4786]: E0313 12:13:25.466684 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244\": container with ID starting with 73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244 not found: ID does not exist" containerID="73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.466700 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244"} err="failed to get container status \"73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244\": rpc error: code = NotFound desc = could not find container \"73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244\": container with ID starting with 73e6ea0904dcf7ffa74da37b74ff5b9465b7052667c8b22d8007ccfb367a7244 not found: ID does not exist" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.496460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-ovn-controller-tls-certs\") pod \"b03b506e-7150-4904-b58b-8e442885af50\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.496538 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-log-ovn\") pod \"b03b506e-7150-4904-b58b-8e442885af50\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.496565 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-combined-ca-bundle\") pod \"b03b506e-7150-4904-b58b-8e442885af50\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " Mar 13 12:13:25 crc kubenswrapper[4786]: 
I0313 12:13:25.496783 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b03b506e-7150-4904-b58b-8e442885af50-scripts\") pod \"b03b506e-7150-4904-b58b-8e442885af50\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.496806 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run-ovn\") pod \"b03b506e-7150-4904-b58b-8e442885af50\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.496874 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwx8k\" (UniqueName: \"kubernetes.io/projected/b03b506e-7150-4904-b58b-8e442885af50-kube-api-access-hwx8k\") pod \"b03b506e-7150-4904-b58b-8e442885af50\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.496996 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run\") pod \"b03b506e-7150-4904-b58b-8e442885af50\" (UID: \"b03b506e-7150-4904-b58b-8e442885af50\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.499306 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b03b506e-7150-4904-b58b-8e442885af50" (UID: "b03b506e-7150-4904-b58b-8e442885af50"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.499346 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03b506e-7150-4904-b58b-8e442885af50-scripts" (OuterVolumeSpecName: "scripts") pod "b03b506e-7150-4904-b58b-8e442885af50" (UID: "b03b506e-7150-4904-b58b-8e442885af50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.500078 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run" (OuterVolumeSpecName: "var-run") pod "b03b506e-7150-4904-b58b-8e442885af50" (UID: "b03b506e-7150-4904-b58b-8e442885af50"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.500115 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b03b506e-7150-4904-b58b-8e442885af50" (UID: "b03b506e-7150-4904-b58b-8e442885af50"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.517326 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03b506e-7150-4904-b58b-8e442885af50-kube-api-access-hwx8k" (OuterVolumeSpecName: "kube-api-access-hwx8k") pod "b03b506e-7150-4904-b58b-8e442885af50" (UID: "b03b506e-7150-4904-b58b-8e442885af50"). InnerVolumeSpecName "kube-api-access-hwx8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.525289 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b03b506e-7150-4904-b58b-8e442885af50" (UID: "b03b506e-7150-4904-b58b-8e442885af50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.561810 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "b03b506e-7150-4904-b58b-8e442885af50" (UID: "b03b506e-7150-4904-b58b-8e442885af50"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.572266 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.574282 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.579371 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598602 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-combined-ca-bundle\") pod \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598648 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/258afae9-f870-4f49-8102-3f987302da26-config-data-generated\") pod \"258afae9-f870-4f49-8102-3f987302da26\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598672 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-config-data-default\") pod \"258afae9-f870-4f49-8102-3f987302da26\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598697 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-galera-tls-certs\") pod \"258afae9-f870-4f49-8102-3f987302da26\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598729 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts\") pod \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598767 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data\") pod \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598804 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pq6l\" (UniqueName: \"kubernetes.io/projected/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-kube-api-access-7pq6l\") pod \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598829 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7nxn\" (UniqueName: \"kubernetes.io/projected/258afae9-f870-4f49-8102-3f987302da26-kube-api-access-t7nxn\") pod \"258afae9-f870-4f49-8102-3f987302da26\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598856 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-operator-scripts\") pod \"258afae9-f870-4f49-8102-3f987302da26\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598954 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn9jp\" (UniqueName: \"kubernetes.io/projected/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-kube-api-access-tn9jp\") pod \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.598993 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-kolla-config\") pod \"258afae9-f870-4f49-8102-3f987302da26\" (UID: 
\"258afae9-f870-4f49-8102-3f987302da26\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599017 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-config-data\") pod \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599042 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-combined-ca-bundle\") pod \"258afae9-f870-4f49-8102-3f987302da26\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599067 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-combined-ca-bundle\") pod \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\" (UID: \"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599108 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"258afae9-f870-4f49-8102-3f987302da26\" (UID: \"258afae9-f870-4f49-8102-3f987302da26\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599128 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-etc-machine-id\") pod \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599152 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom\") pod \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\" (UID: \"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441\") " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599396 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwx8k\" (UniqueName: \"kubernetes.io/projected/b03b506e-7150-4904-b58b-8e442885af50-kube-api-access-hwx8k\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599411 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599424 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599434 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599446 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03b506e-7150-4904-b58b-8e442885af50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599457 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b03b506e-7150-4904-b58b-8e442885af50-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599470 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b03b506e-7150-4904-b58b-8e442885af50-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.599821 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "258afae9-f870-4f49-8102-3f987302da26" (UID: "258afae9-f870-4f49-8102-3f987302da26"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.600423 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "258afae9-f870-4f49-8102-3f987302da26" (UID: "258afae9-f870-4f49-8102-3f987302da26"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.600929 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258afae9-f870-4f49-8102-3f987302da26-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "258afae9-f870-4f49-8102-3f987302da26" (UID: "258afae9-f870-4f49-8102-3f987302da26"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.601655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.601927 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "258afae9-f870-4f49-8102-3f987302da26" (UID: "258afae9-f870-4f49-8102-3f987302da26"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.642947 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts" (OuterVolumeSpecName: "scripts") pod "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.643340 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.643987 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-kube-api-access-tn9jp" (OuterVolumeSpecName: "kube-api-access-tn9jp") pod "612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" (UID: "612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659"). InnerVolumeSpecName "kube-api-access-tn9jp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.644041 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258afae9-f870-4f49-8102-3f987302da26-kube-api-access-t7nxn" (OuterVolumeSpecName: "kube-api-access-t7nxn") pod "258afae9-f870-4f49-8102-3f987302da26" (UID: "258afae9-f870-4f49-8102-3f987302da26"). InnerVolumeSpecName "kube-api-access-t7nxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.651752 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-config-data" (OuterVolumeSpecName: "config-data") pod "612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" (UID: "612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.662552 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "258afae9-f870-4f49-8102-3f987302da26" (UID: "258afae9-f870-4f49-8102-3f987302da26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.662665 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-kube-api-access-7pq6l" (OuterVolumeSpecName: "kube-api-access-7pq6l") pod "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441"). InnerVolumeSpecName "kube-api-access-7pq6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.674341 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "258afae9-f870-4f49-8102-3f987302da26" (UID: "258afae9-f870-4f49-8102-3f987302da26"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.686707 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" (UID: "612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.698064 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.702946 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pq6l\" (UniqueName: \"kubernetes.io/projected/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-kube-api-access-7pq6l\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.703149 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7nxn\" (UniqueName: \"kubernetes.io/projected/258afae9-f870-4f49-8102-3f987302da26-kube-api-access-t7nxn\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.703231 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.703346 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn9jp\" (UniqueName: \"kubernetes.io/projected/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-kube-api-access-tn9jp\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.703464 4786 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.703523 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.703576 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc 
kubenswrapper[4786]: I0313 12:13:25.703638 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.703749 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.705018 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.705055 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.705065 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.705076 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/258afae9-f870-4f49-8102-3f987302da26-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.705085 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/258afae9-f870-4f49-8102-3f987302da26-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.705095 4786 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.732025 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "258afae9-f870-4f49-8102-3f987302da26" (UID: "258afae9-f870-4f49-8102-3f987302da26"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.736647 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.737621 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data" (OuterVolumeSpecName: "config-data") pod "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" (UID: "c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.806534 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.806576 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.806589 4786 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/258afae9-f870-4f49-8102-3f987302da26-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:25 crc kubenswrapper[4786]: I0313 12:13:25.944315 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.008614 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-scripts\") pod \"c03ed618-9a09-48b0-84d4-873357872d22\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.008688 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-combined-ca-bundle\") pod \"c03ed618-9a09-48b0-84d4-873357872d22\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.008710 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-fernet-keys\") pod 
\"c03ed618-9a09-48b0-84d4-873357872d22\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.008730 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvhsp\" (UniqueName: \"kubernetes.io/projected/c03ed618-9a09-48b0-84d4-873357872d22-kube-api-access-gvhsp\") pod \"c03ed618-9a09-48b0-84d4-873357872d22\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.008768 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-config-data\") pod \"c03ed618-9a09-48b0-84d4-873357872d22\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.008827 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-public-tls-certs\") pod \"c03ed618-9a09-48b0-84d4-873357872d22\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.008877 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-internal-tls-certs\") pod \"c03ed618-9a09-48b0-84d4-873357872d22\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.008976 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-credential-keys\") pod \"c03ed618-9a09-48b0-84d4-873357872d22\" (UID: \"c03ed618-9a09-48b0-84d4-873357872d22\") " Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.011752 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-scripts" (OuterVolumeSpecName: "scripts") pod "c03ed618-9a09-48b0-84d4-873357872d22" (UID: "c03ed618-9a09-48b0-84d4-873357872d22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.012531 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c03ed618-9a09-48b0-84d4-873357872d22" (UID: "c03ed618-9a09-48b0-84d4-873357872d22"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.014224 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c03ed618-9a09-48b0-84d4-873357872d22" (UID: "c03ed618-9a09-48b0-84d4-873357872d22"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.019732 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ed618-9a09-48b0-84d4-873357872d22-kube-api-access-gvhsp" (OuterVolumeSpecName: "kube-api-access-gvhsp") pod "c03ed618-9a09-48b0-84d4-873357872d22" (UID: "c03ed618-9a09-48b0-84d4-873357872d22"). InnerVolumeSpecName "kube-api-access-gvhsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.031192 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-config-data" (OuterVolumeSpecName: "config-data") pod "c03ed618-9a09-48b0-84d4-873357872d22" (UID: "c03ed618-9a09-48b0-84d4-873357872d22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.031857 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c03ed618-9a09-48b0-84d4-873357872d22" (UID: "c03ed618-9a09-48b0-84d4-873357872d22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.044234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c03ed618-9a09-48b0-84d4-873357872d22" (UID: "c03ed618-9a09-48b0-84d4-873357872d22"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.053604 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c03ed618-9a09-48b0-84d4-873357872d22" (UID: "c03ed618-9a09-48b0-84d4-873357872d22"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.059746 4786 generic.go:334] "Generic (PLEG): container finished" podID="c03ed618-9a09-48b0-84d4-873357872d22" containerID="495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc" exitCode=0 Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.059800 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d7547f9f8-fzqlk" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.059800 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7547f9f8-fzqlk" event={"ID":"c03ed618-9a09-48b0-84d4-873357872d22","Type":"ContainerDied","Data":"495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc"} Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.059935 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7547f9f8-fzqlk" event={"ID":"c03ed618-9a09-48b0-84d4-873357872d22","Type":"ContainerDied","Data":"559eb214a9ad3cbb50ac47afa586137df65f794c357381a4dc40c61df0bf8e84"} Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.059967 4786 scope.go:117] "RemoveContainer" containerID="495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.065706 4786 generic.go:334] "Generic (PLEG): container finished" podID="258afae9-f870-4f49-8102-3f987302da26" containerID="71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f" exitCode=0 Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.065795 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"258afae9-f870-4f49-8102-3f987302da26","Type":"ContainerDied","Data":"71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f"} Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.065964 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"258afae9-f870-4f49-8102-3f987302da26","Type":"ContainerDied","Data":"2f3ff7be82f53ef853033fb0112d325d3a9220de25c8891d5743a1aab4438220"} Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.066018 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.068205 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-666xn_b03b506e-7150-4904-b58b-8e442885af50/ovn-controller/0.log" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.068314 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666xn" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.069433 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666xn" event={"ID":"b03b506e-7150-4904-b58b-8e442885af50","Type":"ContainerDied","Data":"30776095502a0329caf38ce2f83b6444380f242d9cdfaa99776a8ae91524b0b4"} Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.075650 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441","Type":"ContainerDied","Data":"fd51654ec66219ac04ac6664462e54d989bceffe704584118f28e4259e79ad7e"} Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.075690 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.077383 4786 generic.go:334] "Generic (PLEG): container finished" podID="612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" containerID="baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86" exitCode=0 Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.077447 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.077459 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659","Type":"ContainerDied","Data":"baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86"} Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.077484 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659","Type":"ContainerDied","Data":"5aa3f69b0300a396468330831e52f26e3a75fe7a71548e3cfc302e42bc91bfcc"} Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.111208 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.111234 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.111245 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.111255 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.111266 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:26 
crc kubenswrapper[4786]: I0313 12:13:26.111277 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvhsp\" (UniqueName: \"kubernetes.io/projected/c03ed618-9a09-48b0-84d4-873357872d22-kube-api-access-gvhsp\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.111290 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.111301 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03ed618-9a09-48b0-84d4-873357872d22-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.119659 4786 scope.go:117] "RemoveContainer" containerID="495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc" Mar 13 12:13:26 crc kubenswrapper[4786]: E0313 12:13:26.123115 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc\": container with ID starting with 495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc not found: ID does not exist" containerID="495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.123177 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc"} err="failed to get container status \"495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc\": rpc error: code = NotFound desc = could not find container \"495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc\": container with ID starting with 
495eb9955e5a98c6bd035793e1af020bd4700db354d6f9650d432b0a6bcc0ccc not found: ID does not exist" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.123208 4786 scope.go:117] "RemoveContainer" containerID="71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.127379 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d7547f9f8-fzqlk"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.138162 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d7547f9f8-fzqlk"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.156380 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.170830 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.171061 4786 scope.go:117] "RemoveContainer" containerID="22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.176405 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-666xn"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.184973 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-666xn"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.190106 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.195656 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.215987 4786 scope.go:117] "RemoveContainer" containerID="71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f" Mar 13 12:13:26 crc kubenswrapper[4786]: E0313 12:13:26.216650 4786 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f\": container with ID starting with 71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f not found: ID does not exist" containerID="71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.216693 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f"} err="failed to get container status \"71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f\": rpc error: code = NotFound desc = could not find container \"71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f\": container with ID starting with 71def6827d325804c6395fd12688ce27b18a7cc743f71b1543da005733fcf17f not found: ID does not exist" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.216724 4786 scope.go:117] "RemoveContainer" containerID="22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909" Mar 13 12:13:26 crc kubenswrapper[4786]: E0313 12:13:26.218660 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909\": container with ID starting with 22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909 not found: ID does not exist" containerID="22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.218952 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909"} err="failed to get container status \"22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909\": rpc error: code = NotFound desc = could not find container 
\"22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909\": container with ID starting with 22d00cb7b154a4091b441a87f0174b2265a7999858f5a9855ae5651e3b9dd909 not found: ID does not exist" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.219085 4786 scope.go:117] "RemoveContainer" containerID="15a58925a001b150b8aba5de1a05d26b0e8b136642a71e1d37e08618e19f5026" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.224084 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.231015 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.245595 4786 scope.go:117] "RemoveContainer" containerID="f8c4849551e632909c931cb9c230d8766911210a5e2a92f2d4a1214b86907ca7" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.264993 4786 scope.go:117] "RemoveContainer" containerID="72ec4a750cf5f8f8444ba20cf0c7ee683c4b7001b32595a47908763487ea5853" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.283846 4786 scope.go:117] "RemoveContainer" containerID="baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.303014 4786 scope.go:117] "RemoveContainer" containerID="baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86" Mar 13 12:13:26 crc kubenswrapper[4786]: E0313 12:13:26.303393 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86\": container with ID starting with baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86 not found: ID does not exist" containerID="baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86" Mar 13 12:13:26 crc kubenswrapper[4786]: I0313 12:13:26.303429 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86"} err="failed to get container status \"baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86\": rpc error: code = NotFound desc = could not find container \"baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86\": container with ID starting with baa7eb84b8f7c693307c83de708e662775050d3ce67ec9e1b7f45e67c56e0c86 not found: ID does not exist" Mar 13 12:13:27 crc kubenswrapper[4786]: I0313 12:13:27.451806 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258afae9-f870-4f49-8102-3f987302da26" path="/var/lib/kubelet/pods/258afae9-f870-4f49-8102-3f987302da26/volumes" Mar 13 12:13:27 crc kubenswrapper[4786]: I0313 12:13:27.452773 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" path="/var/lib/kubelet/pods/612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659/volumes" Mar 13 12:13:27 crc kubenswrapper[4786]: I0313 12:13:27.453847 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03b506e-7150-4904-b58b-8e442885af50" path="/var/lib/kubelet/pods/b03b506e-7150-4904-b58b-8e442885af50/volumes" Mar 13 12:13:27 crc kubenswrapper[4786]: I0313 12:13:27.454455 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ed618-9a09-48b0-84d4-873357872d22" path="/var/lib/kubelet/pods/c03ed618-9a09-48b0-84d4-873357872d22/volumes" Mar 13 12:13:27 crc kubenswrapper[4786]: I0313 12:13:27.454958 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" path="/var/lib/kubelet/pods/c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441/volumes" Mar 13 12:13:28 crc kubenswrapper[4786]: E0313 12:13:28.205546 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is 
running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:28 crc kubenswrapper[4786]: E0313 12:13:28.206060 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:28 crc kubenswrapper[4786]: E0313 12:13:28.206212 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:28 crc kubenswrapper[4786]: E0313 12:13:28.206568 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:28 crc kubenswrapper[4786]: E0313 12:13:28.206605 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 
12:13:28 crc kubenswrapper[4786]: E0313 12:13:28.207372 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:28 crc kubenswrapper[4786]: E0313 12:13:28.208383 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:28 crc kubenswrapper[4786]: E0313 12:13:28.208415 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.127060 4786 generic.go:334] "Generic (PLEG): container finished" podID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerID="c90607a25b6719b906805f4956767bf9e8f2062f95bfc51dac8f6059d27ae384" exitCode=0 Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.127129 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerDied","Data":"c90607a25b6719b906805f4956767bf9e8f2062f95bfc51dac8f6059d27ae384"} Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.127445 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1fcb8b06-7f98-4c8b-bae2-1bf657791194","Type":"ContainerDied","Data":"71453dabac256580b94aacf743b7f04cfc1627ce80961f0084ce61ab9770472e"} Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.127473 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71453dabac256580b94aacf743b7f04cfc1627ce80961f0084ce61ab9770472e" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.159752 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.161786 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwpl9\" (UniqueName: \"kubernetes.io/projected/1fcb8b06-7f98-4c8b-bae2-1bf657791194-kube-api-access-xwpl9\") pod \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.162064 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-sg-core-conf-yaml\") pod \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.162152 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-log-httpd\") pod \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.162227 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-config-data\") pod \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " Mar 13 12:13:29 crc kubenswrapper[4786]: 
I0313 12:13:29.162419 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-combined-ca-bundle\") pod \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.163330 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1fcb8b06-7f98-4c8b-bae2-1bf657791194" (UID: "1fcb8b06-7f98-4c8b-bae2-1bf657791194"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.165255 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-ceilometer-tls-certs\") pod \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.165312 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-scripts\") pod \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.165402 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-run-httpd\") pod \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\" (UID: \"1fcb8b06-7f98-4c8b-bae2-1bf657791194\") " Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.166095 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.166496 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1fcb8b06-7f98-4c8b-bae2-1bf657791194" (UID: "1fcb8b06-7f98-4c8b-bae2-1bf657791194"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.171205 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fcb8b06-7f98-4c8b-bae2-1bf657791194-kube-api-access-xwpl9" (OuterVolumeSpecName: "kube-api-access-xwpl9") pod "1fcb8b06-7f98-4c8b-bae2-1bf657791194" (UID: "1fcb8b06-7f98-4c8b-bae2-1bf657791194"). InnerVolumeSpecName "kube-api-access-xwpl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.171224 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-scripts" (OuterVolumeSpecName: "scripts") pod "1fcb8b06-7f98-4c8b-bae2-1bf657791194" (UID: "1fcb8b06-7f98-4c8b-bae2-1bf657791194"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.225191 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1fcb8b06-7f98-4c8b-bae2-1bf657791194" (UID: "1fcb8b06-7f98-4c8b-bae2-1bf657791194"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.251239 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1fcb8b06-7f98-4c8b-bae2-1bf657791194" (UID: "1fcb8b06-7f98-4c8b-bae2-1bf657791194"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.267026 4786 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.267064 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.267075 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fcb8b06-7f98-4c8b-bae2-1bf657791194-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.267089 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwpl9\" (UniqueName: \"kubernetes.io/projected/1fcb8b06-7f98-4c8b-bae2-1bf657791194-kube-api-access-xwpl9\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.267099 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.277450 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-config-data" (OuterVolumeSpecName: "config-data") pod "1fcb8b06-7f98-4c8b-bae2-1bf657791194" (UID: "1fcb8b06-7f98-4c8b-bae2-1bf657791194"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.278305 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fcb8b06-7f98-4c8b-bae2-1bf657791194" (UID: "1fcb8b06-7f98-4c8b-bae2-1bf657791194"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.368122 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:29 crc kubenswrapper[4786]: I0313 12:13:29.368419 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcb8b06-7f98-4c8b-bae2-1bf657791194-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:30 crc kubenswrapper[4786]: I0313 12:13:30.134382 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:13:30 crc kubenswrapper[4786]: I0313 12:13:30.161509 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:13:30 crc kubenswrapper[4786]: I0313 12:13:30.167635 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:13:31 crc kubenswrapper[4786]: I0313 12:13:31.448980 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" path="/var/lib/kubelet/pods/1fcb8b06-7f98-4c8b-bae2-1bf657791194/volumes" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.146920 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.149839 4786 generic.go:334] "Generic (PLEG): container finished" podID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerID="1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555" exitCode=0 Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.149911 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6577bdf497-p2bmr" event={"ID":"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c","Type":"ContainerDied","Data":"1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555"} Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.149942 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6577bdf497-p2bmr" event={"ID":"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c","Type":"ContainerDied","Data":"776ca0d624d9affa9d3bae1a7df7572c7458baa1c7d6fdfaffccc4d21f325dde"} Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.149960 4786 scope.go:117] "RemoveContainer" containerID="617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.186827 4786 scope.go:117] "RemoveContainer" 
containerID="1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.211319 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-combined-ca-bundle\") pod \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.211397 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-public-tls-certs\") pod \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.211482 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-internal-tls-certs\") pod \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.211532 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqdqz\" (UniqueName: \"kubernetes.io/projected/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-kube-api-access-tqdqz\") pod \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.211575 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-config\") pod \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.211655 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-ovndb-tls-certs\") pod \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.211701 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-httpd-config\") pod \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\" (UID: \"0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c\") " Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.218055 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" (UID: "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.220004 4786 scope.go:117] "RemoveContainer" containerID="617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd" Mar 13 12:13:32 crc kubenswrapper[4786]: E0313 12:13:32.221171 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd\": container with ID starting with 617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd not found: ID does not exist" containerID="617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.221210 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd"} err="failed to get container status \"617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd\": rpc error: code = NotFound 
desc = could not find container \"617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd\": container with ID starting with 617524d16ce2268f6c0317d24142e06d1a0de40b90344e4f913506eae744c8cd not found: ID does not exist" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.221236 4786 scope.go:117] "RemoveContainer" containerID="1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555" Mar 13 12:13:32 crc kubenswrapper[4786]: E0313 12:13:32.221956 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555\": container with ID starting with 1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555 not found: ID does not exist" containerID="1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.221992 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555"} err="failed to get container status \"1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555\": rpc error: code = NotFound desc = could not find container \"1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555\": container with ID starting with 1fbc133bcba532e17f9ecab2c252737c6c4bb9a79dd887ca246741468dd03555 not found: ID does not exist" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.230083 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-kube-api-access-tqdqz" (OuterVolumeSpecName: "kube-api-access-tqdqz") pod "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" (UID: "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c"). InnerVolumeSpecName "kube-api-access-tqdqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.253576 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" (UID: "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.257746 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" (UID: "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.259723 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-config" (OuterVolumeSpecName: "config") pod "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" (UID: "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.273253 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" (UID: "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.278872 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" (UID: "0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.312638 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqdqz\" (UniqueName: \"kubernetes.io/projected/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-kube-api-access-tqdqz\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.312674 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.312686 4786 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.312694 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.312703 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.312711 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:32 crc kubenswrapper[4786]: I0313 12:13:32.312719 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:13:33 crc kubenswrapper[4786]: I0313 12:13:33.160932 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6577bdf497-p2bmr" Mar 13 12:13:33 crc kubenswrapper[4786]: I0313 12:13:33.200911 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6577bdf497-p2bmr"] Mar 13 12:13:33 crc kubenswrapper[4786]: E0313 12:13:33.204324 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:33 crc kubenswrapper[4786]: E0313 12:13:33.204782 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:33 crc kubenswrapper[4786]: E0313 12:13:33.205080 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container 
process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:33 crc kubenswrapper[4786]: E0313 12:13:33.205123 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 12:13:33 crc kubenswrapper[4786]: I0313 12:13:33.207749 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6577bdf497-p2bmr"] Mar 13 12:13:33 crc kubenswrapper[4786]: E0313 12:13:33.208174 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:33 crc kubenswrapper[4786]: E0313 12:13:33.211517 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:33 crc kubenswrapper[4786]: E0313 12:13:33.213284 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:33 crc 
kubenswrapper[4786]: E0313 12:13:33.213333 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" Mar 13 12:13:33 crc kubenswrapper[4786]: E0313 12:13:33.297243 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0813a8e1_e94c_43ed_a0d9_fd3fdcb6660c.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:13:33 crc kubenswrapper[4786]: I0313 12:13:33.448582 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" path="/var/lib/kubelet/pods/0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c/volumes" Mar 13 12:13:38 crc kubenswrapper[4786]: E0313 12:13:38.204778 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:38 crc kubenswrapper[4786]: E0313 12:13:38.205540 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:38 crc kubenswrapper[4786]: E0313 12:13:38.205949 4786 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:38 crc kubenswrapper[4786]: E0313 12:13:38.205970 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 12:13:38 crc kubenswrapper[4786]: E0313 12:13:38.206708 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:38 crc kubenswrapper[4786]: E0313 12:13:38.207819 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:38 crc kubenswrapper[4786]: E0313 12:13:38.208809 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:38 crc kubenswrapper[4786]: E0313 12:13:38.208865 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" Mar 13 12:13:43 crc kubenswrapper[4786]: E0313 12:13:43.204812 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:43 crc kubenswrapper[4786]: E0313 12:13:43.205533 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:43 crc kubenswrapper[4786]: E0313 12:13:43.205995 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 12:13:43 crc kubenswrapper[4786]: E0313 12:13:43.206026 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound 
desc = container is not created or running: checking if PID of 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 12:13:43 crc kubenswrapper[4786]: E0313 12:13:43.206059 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:43 crc kubenswrapper[4786]: E0313 12:13:43.207833 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:43 crc kubenswrapper[4786]: E0313 12:13:43.209742 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 12:13:43 crc kubenswrapper[4786]: E0313 12:13:43.209782 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tpch6" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.244238 4786 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-tpch6_187d55eb-db2f-4935-91cc-8ef51895a35a/ovs-vswitchd/0.log" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.245994 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tpch6" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.246057 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.310580 4786 generic.go:334] "Generic (PLEG): container finished" podID="acba774d-de43-4651-a5f0-95875154afad" containerID="d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b" exitCode=137 Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.310689 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b"} Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.310721 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acba774d-de43-4651-a5f0-95875154afad","Type":"ContainerDied","Data":"3c393d37dd069e468a39269dad91c2bbb9381829f40438afa1079f94066daae8"} Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.310720 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.310743 4786 scope.go:117] "RemoveContainer" containerID="d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.315498 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpch6_187d55eb-db2f-4935-91cc-8ef51895a35a/ovs-vswitchd/0.log" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.317538 4786 generic.go:334] "Generic (PLEG): container finished" podID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3" exitCode=137 Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.317590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpch6" event={"ID":"187d55eb-db2f-4935-91cc-8ef51895a35a","Type":"ContainerDied","Data":"57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3"} Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.317624 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpch6" event={"ID":"187d55eb-db2f-4935-91cc-8ef51895a35a","Type":"ContainerDied","Data":"051b19402dfadd324f40642cd77d4b6f14e07391e2fd2b7bc31fc5f9090c7ee7"} Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.317697 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tpch6" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.337924 4786 scope.go:117] "RemoveContainer" containerID="1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.363165 4786 scope.go:117] "RemoveContainer" containerID="3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.364896 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-run\") pod \"187d55eb-db2f-4935-91cc-8ef51895a35a\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.364956 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") pod \"acba774d-de43-4651-a5f0-95875154afad\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365023 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-lock\") pod \"acba774d-de43-4651-a5f0-95875154afad\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365051 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-lib\") pod \"187d55eb-db2f-4935-91cc-8ef51895a35a\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365102 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8fnd\" (UniqueName: 
\"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-kube-api-access-g8fnd\") pod \"acba774d-de43-4651-a5f0-95875154afad\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365148 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-cache\") pod \"acba774d-de43-4651-a5f0-95875154afad\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365189 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/187d55eb-db2f-4935-91cc-8ef51895a35a-scripts\") pod \"187d55eb-db2f-4935-91cc-8ef51895a35a\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365217 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-log\") pod \"187d55eb-db2f-4935-91cc-8ef51895a35a\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365268 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"acba774d-de43-4651-a5f0-95875154afad\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365306 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acba774d-de43-4651-a5f0-95875154afad-combined-ca-bundle\") pod \"acba774d-de43-4651-a5f0-95875154afad\" (UID: \"acba774d-de43-4651-a5f0-95875154afad\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365344 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pgk4j\" (UniqueName: \"kubernetes.io/projected/187d55eb-db2f-4935-91cc-8ef51895a35a-kube-api-access-pgk4j\") pod \"187d55eb-db2f-4935-91cc-8ef51895a35a\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365372 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-etc-ovs\") pod \"187d55eb-db2f-4935-91cc-8ef51895a35a\" (UID: \"187d55eb-db2f-4935-91cc-8ef51895a35a\") " Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365865 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "187d55eb-db2f-4935-91cc-8ef51895a35a" (UID: "187d55eb-db2f-4935-91cc-8ef51895a35a"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.368712 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-log" (OuterVolumeSpecName: "var-log") pod "187d55eb-db2f-4935-91cc-8ef51895a35a" (UID: "187d55eb-db2f-4935-91cc-8ef51895a35a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.365954 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-run" (OuterVolumeSpecName: "var-run") pod "187d55eb-db2f-4935-91cc-8ef51895a35a" (UID: "187d55eb-db2f-4935-91cc-8ef51895a35a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.367114 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187d55eb-db2f-4935-91cc-8ef51895a35a-scripts" (OuterVolumeSpecName: "scripts") pod "187d55eb-db2f-4935-91cc-8ef51895a35a" (UID: "187d55eb-db2f-4935-91cc-8ef51895a35a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.368359 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-lock" (OuterVolumeSpecName: "lock") pod "acba774d-de43-4651-a5f0-95875154afad" (UID: "acba774d-de43-4651-a5f0-95875154afad"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.368376 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-lib" (OuterVolumeSpecName: "var-lib") pod "187d55eb-db2f-4935-91cc-8ef51895a35a" (UID: "187d55eb-db2f-4935-91cc-8ef51895a35a"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.370663 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-cache" (OuterVolumeSpecName: "cache") pod "acba774d-de43-4651-a5f0-95875154afad" (UID: "acba774d-de43-4651-a5f0-95875154afad"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.371617 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "acba774d-de43-4651-a5f0-95875154afad" (UID: "acba774d-de43-4651-a5f0-95875154afad"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.371870 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "acba774d-de43-4651-a5f0-95875154afad" (UID: "acba774d-de43-4651-a5f0-95875154afad"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.376277 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187d55eb-db2f-4935-91cc-8ef51895a35a-kube-api-access-pgk4j" (OuterVolumeSpecName: "kube-api-access-pgk4j") pod "187d55eb-db2f-4935-91cc-8ef51895a35a" (UID: "187d55eb-db2f-4935-91cc-8ef51895a35a"). InnerVolumeSpecName "kube-api-access-pgk4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.377538 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-kube-api-access-g8fnd" (OuterVolumeSpecName: "kube-api-access-g8fnd") pod "acba774d-de43-4651-a5f0-95875154afad" (UID: "acba774d-de43-4651-a5f0-95875154afad"). InnerVolumeSpecName "kube-api-access-g8fnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.424828 4786 scope.go:117] "RemoveContainer" containerID="1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.448210 4786 scope.go:117] "RemoveContainer" containerID="80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.466643 4786 scope.go:117] "RemoveContainer" containerID="3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467784 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgk4j\" (UniqueName: \"kubernetes.io/projected/187d55eb-db2f-4935-91cc-8ef51895a35a-kube-api-access-pgk4j\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467816 4786 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-etc-ovs\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467826 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-run\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467835 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467844 4786 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-lock\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467852 4786 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-lib\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467860 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8fnd\" (UniqueName: \"kubernetes.io/projected/acba774d-de43-4651-a5f0-95875154afad-kube-api-access-g8fnd\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467867 4786 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/acba774d-de43-4651-a5f0-95875154afad-cache\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467875 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/187d55eb-db2f-4935-91cc-8ef51895a35a-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467909 4786 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/187d55eb-db2f-4935-91cc-8ef51895a35a-var-log\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.467946 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.481710 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.484144 4786 scope.go:117] "RemoveContainer" containerID="e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.515125 4786 scope.go:117] "RemoveContainer" containerID="a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.535278 4786 scope.go:117] "RemoveContainer" containerID="18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.553935 4786 scope.go:117] "RemoveContainer" containerID="cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.569264 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.569534 4786 scope.go:117] "RemoveContainer" containerID="7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.588812 4786 scope.go:117] "RemoveContainer" containerID="bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.605568 4786 scope.go:117] "RemoveContainer" containerID="82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.622027 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acba774d-de43-4651-a5f0-95875154afad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acba774d-de43-4651-a5f0-95875154afad" (UID: "acba774d-de43-4651-a5f0-95875154afad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.626473 4786 scope.go:117] "RemoveContainer" containerID="090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.648619 4786 scope.go:117] "RemoveContainer" containerID="8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.649610 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tpch6"]
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.656633 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tpch6"]
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.670903 4786 scope.go:117] "RemoveContainer" containerID="d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.670906 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acba774d-de43-4651-a5f0-95875154afad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.671754 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b\": container with ID starting with d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b not found: ID does not exist" containerID="d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.671793 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b"} err="failed to get container status \"d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b\": rpc error: code = NotFound desc = could not find container \"d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b\": container with ID starting with d1641a4b6cfd3b5cab01dbb3ad92f5c8fba614c8b2686deeef1993deda6a979b not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.671820 4786 scope.go:117] "RemoveContainer" containerID="1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.672417 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640\": container with ID starting with 1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640 not found: ID does not exist" containerID="1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.672482 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640"} err="failed to get container status \"1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640\": rpc error: code = NotFound desc = could not find container \"1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640\": container with ID starting with 1a7aa23a191809f2c9ca6bfaa0c94b2a65c4f98c76ce894868163ca6e5929640 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.672535 4786 scope.go:117] "RemoveContainer" containerID="3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.673073 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f\": container with ID starting with 3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f not found: ID does not exist" containerID="3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.673111 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f"} err="failed to get container status \"3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f\": rpc error: code = NotFound desc = could not find container \"3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f\": container with ID starting with 3e1b00bad5cda71856279a9140c08f92b5620080df91455e35a69e6c59a7080f not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.673129 4786 scope.go:117] "RemoveContainer" containerID="1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.673440 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617\": container with ID starting with 1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617 not found: ID does not exist" containerID="1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.673541 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617"} err="failed to get container status \"1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617\": rpc error: code = NotFound desc = could not find container \"1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617\": container with ID starting with 1ab02972f5a18da405778ed95e793c10b1c55895a701e132c5b6d02973386617 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.673566 4786 scope.go:117] "RemoveContainer" containerID="80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.674147 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f\": container with ID starting with 80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f not found: ID does not exist" containerID="80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.674182 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f"} err="failed to get container status \"80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f\": rpc error: code = NotFound desc = could not find container \"80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f\": container with ID starting with 80731da625e21eb6d0cd197d3bd5efb0a3d17971f7cc4452905dc9996b21f45f not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.674206 4786 scope.go:117] "RemoveContainer" containerID="3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.674743 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227\": container with ID starting with 3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227 not found: ID does not exist" containerID="3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.674778 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227"} err="failed to get container status \"3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227\": rpc error: code = NotFound desc = could not find container \"3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227\": container with ID starting with 3806e296a47f07eee04b317a71a7258f53e81c683186ec6ce0a74f7644ac3227 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.674797 4786 scope.go:117] "RemoveContainer" containerID="e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.675334 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e\": container with ID starting with e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e not found: ID does not exist" containerID="e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.675373 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e"} err="failed to get container status \"e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e\": rpc error: code = NotFound desc = could not find container \"e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e\": container with ID starting with e71e1ef0a50cf17f3f4d836e5e4a257791964eb7df126cc4db579954b5d1fe4e not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.675403 4786 scope.go:117] "RemoveContainer" containerID="a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.675730 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3\": container with ID starting with a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3 not found: ID does not exist" containerID="a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.675773 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3"} err="failed to get container status \"a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3\": rpc error: code = NotFound desc = could not find container \"a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3\": container with ID starting with a005db48b83df546f58b511cb24bbf86bf48e43f8a9630dfd2479b2cc94372b3 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.675800 4786 scope.go:117] "RemoveContainer" containerID="18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.676277 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b\": container with ID starting with 18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b not found: ID does not exist" containerID="18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.676310 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b"} err="failed to get container status \"18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b\": rpc error: code = NotFound desc = could not find container \"18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b\": container with ID starting with 18a667b5befa9c10889240a31a56e250d425885efe3f88731f71e7c3fa3e699b not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.676330 4786 scope.go:117] "RemoveContainer" containerID="cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.676575 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885\": container with ID starting with cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885 not found: ID does not exist" containerID="cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.676616 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885"} err="failed to get container status \"cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885\": rpc error: code = NotFound desc = could not find container \"cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885\": container with ID starting with cad244a77ae0cbf8516072aa5595d19d1c43c2d7294a9efffc375cd22fafa885 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.676789 4786 scope.go:117] "RemoveContainer" containerID="7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.677086 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c\": container with ID starting with 7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c not found: ID does not exist" containerID="7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.677126 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c"} err="failed to get container status \"7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c\": rpc error: code = NotFound desc = could not find container \"7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c\": container with ID starting with 7f07a719b9b80bdb54a3c42dbf5b3108080cbad6e1fc99d80c2aab76eead2f2c not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.677146 4786 scope.go:117] "RemoveContainer" containerID="bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.677583 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f\": container with ID starting with bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f not found: ID does not exist" containerID="bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.677628 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f"} err="failed to get container status \"bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f\": rpc error: code = NotFound desc = could not find container \"bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f\": container with ID starting with bc8b5032f4db6703ac9b08c03efbd2de9c7a62a76ff2b5d09ee2f61ef676664f not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.677658 4786 scope.go:117] "RemoveContainer" containerID="82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.678238 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839\": container with ID starting with 82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839 not found: ID does not exist" containerID="82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.678277 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839"} err="failed to get container status \"82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839\": rpc error: code = NotFound desc = could not find container \"82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839\": container with ID starting with 82c830a70839aa956f6e27edfb1775e82bf0f1e9c8204035a219071457f51839 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.678297 4786 scope.go:117] "RemoveContainer" containerID="090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.678871 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d\": container with ID starting with 090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d not found: ID does not exist" containerID="090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.678942 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d"} err="failed to get container status \"090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d\": rpc error: code = NotFound desc = could not find container \"090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d\": container with ID starting with 090d877ef3407295cfd6ee6a6d8a9800e3ed9c04bcccb8c064d63613624fcc9d not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.679025 4786 scope.go:117] "RemoveContainer" containerID="8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.679347 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3\": container with ID starting with 8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3 not found: ID does not exist" containerID="8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.679394 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3"} err="failed to get container status \"8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3\": rpc error: code = NotFound desc = could not find container \"8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3\": container with ID starting with 8cc4a2f14430bc7db02eed1ca6928bd50d2a213f2b73d307c1ee4e58343183a3 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.679429 4786 scope.go:117] "RemoveContainer" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.739345 4786 scope.go:117] "RemoveContainer" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.761000 4786 scope.go:117] "RemoveContainer" containerID="f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.788735 4786 scope.go:117] "RemoveContainer" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.789205 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3\": container with ID starting with 57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3 not found: ID does not exist" containerID="57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.789240 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3"} err="failed to get container status \"57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3\": rpc error: code = NotFound desc = could not find container \"57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3\": container with ID starting with 57f4daa1a1c5a59127ac868967ac7268b84b413f304a328b2fdfa958f84966f3 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.789272 4786 scope.go:117] "RemoveContainer" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.789604 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd\": container with ID starting with 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd not found: ID does not exist" containerID="7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.789632 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd"} err="failed to get container status \"7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd\": rpc error: code = NotFound desc = could not find container \"7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd\": container with ID starting with 7dcb1cb334ed18c8150ada1dcb74686fe0534cb7b07d0ec66f9fb15373b10ccd not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.789651 4786 scope.go:117] "RemoveContainer" containerID="f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34"
Mar 13 12:13:47 crc kubenswrapper[4786]: E0313 12:13:47.789845 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34\": container with ID starting with f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34 not found: ID does not exist" containerID="f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.789868 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34"} err="failed to get container status \"f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34\": rpc error: code = NotFound desc = could not find container \"f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34\": container with ID starting with f1734e1c9a8cdc0e4685c6833d466bfe80a5e4db60cd16e477e683f5347aae34 not found: ID does not exist"
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.944013 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Mar 13 12:13:47 crc kubenswrapper[4786]: I0313 12:13:47.949795 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"]
Mar 13 12:13:49 crc kubenswrapper[4786]: I0313 12:13:49.455845 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" path="/var/lib/kubelet/pods/187d55eb-db2f-4935-91cc-8ef51895a35a/volumes"
Mar 13 12:13:49 crc kubenswrapper[4786]: I0313 12:13:49.457831 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acba774d-de43-4651-a5f0-95875154afad" path="/var/lib/kubelet/pods/acba774d-de43-4651-a5f0-95875154afad/volumes"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.179720 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556734-k9nqw"]
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180622 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerName="setup-container"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180664 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerName="setup-container"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180692 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerName="neutron-api"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180704 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerName="neutron-api"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180752 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="proxy-httpd"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180765 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="proxy-httpd"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180777 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="sg-core"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180784 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="sg-core"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180792 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerName="cinder-scheduler"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180800 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerName="cinder-scheduler"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180815 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="rsync"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180822 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="rsync"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180834 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03b506e-7150-4904-b58b-8e442885af50" containerName="ovn-controller"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180842 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03b506e-7150-4904-b58b-8e442885af50" containerName="ovn-controller"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180854 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api-log"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180862 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api-log"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180873 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-server"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180896 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-server"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180907 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-log"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180914 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-log"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180928 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="ceilometer-central-agent"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180936 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="ceilometer-central-agent"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180953 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerName="probe"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180960 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerName="probe"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180973 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerName="glance-httpd"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180981 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerName="glance-httpd"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.180991 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-api"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.180999 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-api"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181014 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-updater"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181023 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-updater"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181036 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39720781-e027-4319-9c8f-1d9134d269f8" containerName="kube-state-metrics"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181045 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="39720781-e027-4319-9c8f-1d9134d269f8" containerName="kube-state-metrics"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181055 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" containerName="nova-scheduler-scheduler"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181066 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" containerName="nova-scheduler-scheduler"
Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181082 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="swift-recon-cron"
Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181090 4786 state_mem.go:107]
"Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="swift-recon-cron" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181104 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181112 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181124 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181131 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181145 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-log" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181153 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-log" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181161 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-expirer" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181168 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-expirer" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181182 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181190 4786 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181204 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-metadata" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181212 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-metadata" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181221 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03ed618-9a09-48b0-84d4-873357872d22" containerName="keystone-api" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181229 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03ed618-9a09-48b0-84d4-873357872d22" containerName="keystone-api" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181239 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerName="rabbitmq" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181246 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerName="rabbitmq" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181256 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181263 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181272 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181279 4786 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181289 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181298 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181305 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181313 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181322 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181330 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181338 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181346 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181358 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4f5bdf5-c352-4722-bcbd-704965ab36f0" containerName="memcached" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181365 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e4f5bdf5-c352-4722-bcbd-704965ab36f0" containerName="memcached" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181374 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerName="rabbitmq" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181381 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerName="rabbitmq" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181390 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="openstack-network-exporter" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181397 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="openstack-network-exporter" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181409 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-updater" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181417 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-updater" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181425 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerName="neutron-httpd" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181433 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerName="neutron-httpd" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181444 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerName="setup-container" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181451 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerName="setup-container" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181466 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="ceilometer-notification-agent" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181473 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="ceilometer-notification-agent" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181489 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerName="glance-log" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181496 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerName="glance-log" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181509 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-server" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181517 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-server" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181530 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-server" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181537 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-server" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181549 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258afae9-f870-4f49-8102-3f987302da26" containerName="mysql-bootstrap" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181556 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="258afae9-f870-4f49-8102-3f987302da26" containerName="mysql-bootstrap" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181570 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server-init" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181578 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server-init" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181589 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b488d3ce-635a-4279-a05e-fba3b6599bda" containerName="nova-cell1-conductor-conductor" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181598 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b488d3ce-635a-4279-a05e-fba3b6599bda" containerName="nova-cell1-conductor-conductor" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181610 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="ovn-northd" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181617 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="ovn-northd" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181629 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258afae9-f870-4f49-8102-3f987302da26" containerName="galera" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181636 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="258afae9-f870-4f49-8102-3f987302da26" containerName="galera" Mar 13 12:14:00 crc kubenswrapper[4786]: E0313 12:14:00.181648 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-reaper" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181655 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-reaper" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181809 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="proxy-httpd" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181826 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-expirer" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181838 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerName="neutron-api" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181854 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="sg-core" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181869 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="ceilometer-central-agent" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181900 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-log" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181910 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="swift-recon-cron" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181919 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b196d91-2a1f-4ee5-81d5-0133f2917cc5" containerName="rabbitmq" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181928 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-reaper" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181938 4786 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api-log" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181952 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="39720781-e027-4319-9c8f-1d9134d269f8" containerName="kube-state-metrics" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181960 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="612e2d14-3dcd-4dd6-ad75-e0f9ffb4a659" containerName="nova-scheduler-scheduler" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181973 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181983 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="258afae9-f870-4f49-8102-3f987302da26" containerName="galera" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.181997 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerName="probe" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182008 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182017 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-log" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182030 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovs-vswitchd" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182038 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-updater" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182053 4786 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b488d3ce-635a-4279-a05e-fba3b6599bda" containerName="nova-cell1-conductor-conductor" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182067 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="187d55eb-db2f-4935-91cc-8ef51895a35a" containerName="ovsdb-server" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182080 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-server" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182091 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182099 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="openstack-network-exporter" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182108 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="53fea24b-7ca8-4c0a-96d1-458ca1e877a7" containerName="rabbitmq" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182120 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="rsync" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182144 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b25a4cb-7b76-4863-9085-67f99d81f569" containerName="ovn-northd" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182155 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="124c632a-4ff3-419c-9e26-ba68929feeb7" containerName="barbican-api" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182165 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4f5bdf5-c352-4722-bcbd-704965ab36f0" containerName="memcached" Mar 13 12:14:00 crc kubenswrapper[4786]: 
I0313 12:14:00.182178 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182188 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a06f1e9-ddda-42a5-ab33-88473c56a6c7" containerName="nova-metadata-metadata" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182198 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c067e1cd-e5a9-413d-9ddc-4e1f4a6a0441" containerName="cinder-scheduler" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182207 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="object-auditor" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182215 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-server" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182224 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerName="glance-log" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182233 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03ed618-9a09-48b0-84d4-873357872d22" containerName="keystone-api" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182245 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-server" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182257 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67ad69d-5191-4d93-9326-b93b0653a82c" containerName="nova-api-api" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182270 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03b506e-7150-4904-b58b-8e442885af50" containerName="ovn-controller" Mar 13 
12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182280 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="container-updater" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182290 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba774d-de43-4651-a5f0-95875154afad" containerName="account-replicator" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182301 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcb8b06-7f98-4c8b-bae2-1bf657791194" containerName="ceilometer-notification-agent" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182309 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0813a8e1-e94c-43ed-a0d9-fd3fdcb6660c" containerName="neutron-httpd" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182316 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab23f85-03a5-4df3-bfa8-da6f748f44e3" containerName="glance-httpd" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.182825 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-k9nqw" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.187486 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.187742 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.187930 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.192275 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-k9nqw"] Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.359390 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97xwt\" (UniqueName: \"kubernetes.io/projected/be27b311-0b0d-4861-85e4-48166da8614c-kube-api-access-97xwt\") pod \"auto-csr-approver-29556734-k9nqw\" (UID: \"be27b311-0b0d-4861-85e4-48166da8614c\") " pod="openshift-infra/auto-csr-approver-29556734-k9nqw" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.461620 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97xwt\" (UniqueName: \"kubernetes.io/projected/be27b311-0b0d-4861-85e4-48166da8614c-kube-api-access-97xwt\") pod \"auto-csr-approver-29556734-k9nqw\" (UID: \"be27b311-0b0d-4861-85e4-48166da8614c\") " pod="openshift-infra/auto-csr-approver-29556734-k9nqw" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.483586 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97xwt\" (UniqueName: \"kubernetes.io/projected/be27b311-0b0d-4861-85e4-48166da8614c-kube-api-access-97xwt\") pod \"auto-csr-approver-29556734-k9nqw\" (UID: \"be27b311-0b0d-4861-85e4-48166da8614c\") " 
pod="openshift-infra/auto-csr-approver-29556734-k9nqw" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.518464 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-k9nqw" Mar 13 12:14:00 crc kubenswrapper[4786]: I0313 12:14:00.968601 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-k9nqw"] Mar 13 12:14:01 crc kubenswrapper[4786]: I0313 12:14:01.457350 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-k9nqw" event={"ID":"be27b311-0b0d-4861-85e4-48166da8614c","Type":"ContainerStarted","Data":"21c7ec184792f58a95a34d110078c2dced34098e156ec390d20a3d09a00003f7"} Mar 13 12:14:02 crc kubenswrapper[4786]: I0313 12:14:02.467874 4786 generic.go:334] "Generic (PLEG): container finished" podID="be27b311-0b0d-4861-85e4-48166da8614c" containerID="06f6ba5eb3ee12cc3786bb0d68962fbb56f0f50aa7c77330b87079d6da1c0f94" exitCode=0 Mar 13 12:14:02 crc kubenswrapper[4786]: I0313 12:14:02.467999 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-k9nqw" event={"ID":"be27b311-0b0d-4861-85e4-48166da8614c","Type":"ContainerDied","Data":"06f6ba5eb3ee12cc3786bb0d68962fbb56f0f50aa7c77330b87079d6da1c0f94"} Mar 13 12:14:03 crc kubenswrapper[4786]: I0313 12:14:03.760685 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-k9nqw" Mar 13 12:14:03 crc kubenswrapper[4786]: I0313 12:14:03.914610 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97xwt\" (UniqueName: \"kubernetes.io/projected/be27b311-0b0d-4861-85e4-48166da8614c-kube-api-access-97xwt\") pod \"be27b311-0b0d-4861-85e4-48166da8614c\" (UID: \"be27b311-0b0d-4861-85e4-48166da8614c\") " Mar 13 12:14:03 crc kubenswrapper[4786]: I0313 12:14:03.932427 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be27b311-0b0d-4861-85e4-48166da8614c-kube-api-access-97xwt" (OuterVolumeSpecName: "kube-api-access-97xwt") pod "be27b311-0b0d-4861-85e4-48166da8614c" (UID: "be27b311-0b0d-4861-85e4-48166da8614c"). InnerVolumeSpecName "kube-api-access-97xwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:14:04 crc kubenswrapper[4786]: I0313 12:14:04.017222 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97xwt\" (UniqueName: \"kubernetes.io/projected/be27b311-0b0d-4861-85e4-48166da8614c-kube-api-access-97xwt\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:04 crc kubenswrapper[4786]: I0313 12:14:04.488484 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-k9nqw" event={"ID":"be27b311-0b0d-4861-85e4-48166da8614c","Type":"ContainerDied","Data":"21c7ec184792f58a95a34d110078c2dced34098e156ec390d20a3d09a00003f7"} Mar 13 12:14:04 crc kubenswrapper[4786]: I0313 12:14:04.488567 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21c7ec184792f58a95a34d110078c2dced34098e156ec390d20a3d09a00003f7" Mar 13 12:14:04 crc kubenswrapper[4786]: I0313 12:14:04.488699 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-k9nqw" Mar 13 12:14:04 crc kubenswrapper[4786]: I0313 12:14:04.835032 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-nddsg"] Mar 13 12:14:04 crc kubenswrapper[4786]: I0313 12:14:04.842300 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-nddsg"] Mar 13 12:14:05 crc kubenswrapper[4786]: I0313 12:14:05.451359 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee76d39-03f4-4564-98f5-4903ea00568f" path="/var/lib/kubelet/pods/7ee76d39-03f4-4564-98f5-4903ea00568f/volumes" Mar 13 12:14:08 crc kubenswrapper[4786]: I0313 12:14:08.169386 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:14:08 crc kubenswrapper[4786]: I0313 12:14:08.170959 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:14:13 crc kubenswrapper[4786]: I0313 12:14:13.769730 4786 scope.go:117] "RemoveContainer" containerID="27d6eb8401490fb55d774c4f395089b4fb75b0cc2244cbaf43b5759b74129ca6" Mar 13 12:14:38 crc kubenswrapper[4786]: I0313 12:14:38.169507 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:14:38 crc kubenswrapper[4786]: 
I0313 12:14:38.170223 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.174368 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf"] Mar 13 12:15:00 crc kubenswrapper[4786]: E0313 12:15:00.175258 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be27b311-0b0d-4861-85e4-48166da8614c" containerName="oc" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.175276 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="be27b311-0b0d-4861-85e4-48166da8614c" containerName="oc" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.175463 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="be27b311-0b0d-4861-85e4-48166da8614c" containerName="oc" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.176035 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.178787 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.178828 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.193104 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf"] Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.268338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-secret-volume\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.268393 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt94l\" (UniqueName: \"kubernetes.io/projected/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-kube-api-access-vt94l\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.268437 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-config-volume\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.370070 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-secret-volume\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.370120 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt94l\" (UniqueName: \"kubernetes.io/projected/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-kube-api-access-vt94l\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.370165 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-config-volume\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.371238 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-config-volume\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.376352 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-secret-volume\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.388131 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt94l\" (UniqueName: \"kubernetes.io/projected/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-kube-api-access-vt94l\") pod \"collect-profiles-29556735-r9fcf\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.517572 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:00 crc kubenswrapper[4786]: I0313 12:15:00.943065 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf"] Mar 13 12:15:01 crc kubenswrapper[4786]: I0313 12:15:01.024612 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" event={"ID":"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b","Type":"ContainerStarted","Data":"d679915608ecaf88c35976deda3467ceba80f56a2ab35e2be3f8349aa39fd790"} Mar 13 12:15:02 crc kubenswrapper[4786]: I0313 12:15:02.037106 4786 generic.go:334] "Generic (PLEG): container finished" podID="849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b" containerID="0cb12496b8a6706cffc3915bd459e1f457e1f4388d632e87469bbb5ef8a2b453" exitCode=0 Mar 13 12:15:02 crc kubenswrapper[4786]: I0313 12:15:02.037514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" 
event={"ID":"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b","Type":"ContainerDied","Data":"0cb12496b8a6706cffc3915bd459e1f457e1f4388d632e87469bbb5ef8a2b453"} Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.418534 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.519844 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-secret-volume\") pod \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.519968 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-config-volume\") pod \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.520163 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt94l\" (UniqueName: \"kubernetes.io/projected/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-kube-api-access-vt94l\") pod \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\" (UID: \"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b\") " Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.520773 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-config-volume" (OuterVolumeSpecName: "config-volume") pod "849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b" (UID: "849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.525835 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-kube-api-access-vt94l" (OuterVolumeSpecName: "kube-api-access-vt94l") pod "849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b" (UID: "849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b"). InnerVolumeSpecName "kube-api-access-vt94l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.526241 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b" (UID: "849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.622597 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.622938 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:03 crc kubenswrapper[4786]: I0313 12:15:03.623038 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt94l\" (UniqueName: \"kubernetes.io/projected/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b-kube-api-access-vt94l\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:04 crc kubenswrapper[4786]: I0313 12:15:04.073286 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" 
event={"ID":"849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b","Type":"ContainerDied","Data":"d679915608ecaf88c35976deda3467ceba80f56a2ab35e2be3f8349aa39fd790"} Mar 13 12:15:04 crc kubenswrapper[4786]: I0313 12:15:04.073348 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d679915608ecaf88c35976deda3467ceba80f56a2ab35e2be3f8349aa39fd790" Mar 13 12:15:04 crc kubenswrapper[4786]: I0313 12:15:04.073364 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf" Mar 13 12:15:08 crc kubenswrapper[4786]: I0313 12:15:08.169843 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:15:08 crc kubenswrapper[4786]: I0313 12:15:08.170438 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:15:08 crc kubenswrapper[4786]: I0313 12:15:08.170492 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 12:15:08 crc kubenswrapper[4786]: I0313 12:15:08.171228 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:15:08 crc 
kubenswrapper[4786]: I0313 12:15:08.171487 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" gracePeriod=600 Mar 13 12:15:08 crc kubenswrapper[4786]: E0313 12:15:08.301106 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:15:09 crc kubenswrapper[4786]: I0313 12:15:09.128412 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" exitCode=0 Mar 13 12:15:09 crc kubenswrapper[4786]: I0313 12:15:09.128464 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754"} Mar 13 12:15:09 crc kubenswrapper[4786]: I0313 12:15:09.128525 4786 scope.go:117] "RemoveContainer" containerID="1e7dfa6aebd4ca8695c470b1c4f1a2306b0f2eefc624c2d634686a5a8cd4e40b" Mar 13 12:15:09 crc kubenswrapper[4786]: I0313 12:15:09.129393 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:15:09 crc kubenswrapper[4786]: E0313 12:15:09.129852 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.560397 4786 scope.go:117] "RemoveContainer" containerID="c5e3569be852ead07be105446ea4c198eaab3173139bc6c84679868abfdda561" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.603189 4786 scope.go:117] "RemoveContainer" containerID="0af6194d99a56d00d89b6a59c543eeee81163c06a3dddb5e0f1108fc2b69ec6e" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.648778 4786 scope.go:117] "RemoveContainer" containerID="e2c019ff0348a178bded3382ef62270a929b3de44dbd0995c92698df2bef3289" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.692330 4786 scope.go:117] "RemoveContainer" containerID="63d1c6a4ed628e0270bfaf6f6a59a54966e217bd3f5f6948011b4c84cc5c5d66" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.713742 4786 scope.go:117] "RemoveContainer" containerID="1e10720c51b5e71372255bc026ee7d2a7ce0fc88ce17d5796e1f682a8d8fef6c" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.764187 4786 scope.go:117] "RemoveContainer" containerID="85d10348665228b36bb7af791198dbb74fe9596aaf08a0b9b13a8d28493b4749" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.782126 4786 scope.go:117] "RemoveContainer" containerID="a95a3ffb3a7cfd95fc50a3707e38a7f96e07f8e41e18b54da33b225a494389a7" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.806659 4786 scope.go:117] "RemoveContainer" containerID="0c8aae5fcc18920f5dd30bba03e2ecfbcd982311b38cc18ee5401f8207fe92e9" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.840202 4786 scope.go:117] "RemoveContainer" containerID="366f7bdf5192896510fd5be7476e2f1fed3d2e1a33505671ebc219e89e72f9db" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.900420 4786 
scope.go:117] "RemoveContainer" containerID="9ab0287f8e3b9501e6a4aadca9cf4e40efa14d5aa33a94b16e22d38625d8e92c" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.932356 4786 scope.go:117] "RemoveContainer" containerID="3e8ce702d9966b96d3a99192c9116bb2858d49fd20d91333e6186efe37b8fe1e" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.960186 4786 scope.go:117] "RemoveContainer" containerID="c47dabf207c6254379ea6b9544893c6b3c97efea63726763b4c7bf4d8a82c764" Mar 13 12:15:14 crc kubenswrapper[4786]: I0313 12:15:14.985481 4786 scope.go:117] "RemoveContainer" containerID="8546d15d615043030d104f666fcccae710b91eaabc4b545097a038170b3a7dcf" Mar 13 12:15:15 crc kubenswrapper[4786]: I0313 12:15:15.012017 4786 scope.go:117] "RemoveContainer" containerID="30476293489f29d0bf28a9f340cd844ff67d79f12b2a58ed298bb9282c465e69" Mar 13 12:15:15 crc kubenswrapper[4786]: I0313 12:15:15.066461 4786 scope.go:117] "RemoveContainer" containerID="c3d7a7117c0b8a182edf5a164208dda5d4e7f7b6662de2f2a8cf06ca833520b5" Mar 13 12:15:15 crc kubenswrapper[4786]: I0313 12:15:15.089205 4786 scope.go:117] "RemoveContainer" containerID="39984635e5585195e1c8d56b7a28c676d85cf6623b602e27e486afa71454a871" Mar 13 12:15:15 crc kubenswrapper[4786]: I0313 12:15:15.107259 4786 scope.go:117] "RemoveContainer" containerID="af9a58bcbaa96db6c049cf3f0beded05596724e061992aa2269d55f6577c1329" Mar 13 12:15:15 crc kubenswrapper[4786]: I0313 12:15:15.129601 4786 scope.go:117] "RemoveContainer" containerID="9234d8507c7d5b1040d1c4371fed627976b7224e46dea272c706f536bfab99b6" Mar 13 12:15:15 crc kubenswrapper[4786]: I0313 12:15:15.157047 4786 scope.go:117] "RemoveContainer" containerID="06cbe6fcaf2610a101d910170cbceb0e2a4ec74893cf6c2d3fde4fd12608b429" Mar 13 12:15:15 crc kubenswrapper[4786]: I0313 12:15:15.186503 4786 scope.go:117] "RemoveContainer" containerID="a80da01ce4b12b8ab209d5616c1d6a41c1d72b0e70b32ddcb0d4f8b3a0f1fd69" Mar 13 12:15:15 crc kubenswrapper[4786]: I0313 12:15:15.209120 4786 scope.go:117] 
"RemoveContainer" containerID="c29161d84a60eeddfde50252c6d91b69df94664233f5fe0d9fba02e17dc0a1ec" Mar 13 12:15:20 crc kubenswrapper[4786]: I0313 12:15:20.442297 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:15:20 crc kubenswrapper[4786]: E0313 12:15:20.443857 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:15:31 crc kubenswrapper[4786]: I0313 12:15:31.440795 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:15:31 crc kubenswrapper[4786]: E0313 12:15:31.442026 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:15:46 crc kubenswrapper[4786]: I0313 12:15:46.440950 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:15:46 crc kubenswrapper[4786]: E0313 12:15:46.442279 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.156711 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dtk4q"] Mar 13 12:15:48 crc kubenswrapper[4786]: E0313 12:15:48.157075 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b" containerName="collect-profiles" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.157090 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b" containerName="collect-profiles" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.157266 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b" containerName="collect-profiles" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.158657 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.177429 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtk4q"] Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.247526 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6f9p\" (UniqueName: \"kubernetes.io/projected/face314d-1ec9-4abd-90a8-f2401f55160e-kube-api-access-z6f9p\") pod \"redhat-marketplace-dtk4q\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.247960 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-catalog-content\") pod \"redhat-marketplace-dtk4q\" (UID: 
\"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.248017 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-utilities\") pod \"redhat-marketplace-dtk4q\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.350095 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-catalog-content\") pod \"redhat-marketplace-dtk4q\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.350452 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-utilities\") pod \"redhat-marketplace-dtk4q\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.350617 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-catalog-content\") pod \"redhat-marketplace-dtk4q\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.350622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6f9p\" (UniqueName: \"kubernetes.io/projected/face314d-1ec9-4abd-90a8-f2401f55160e-kube-api-access-z6f9p\") pod \"redhat-marketplace-dtk4q\" (UID: 
\"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.350934 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-utilities\") pod \"redhat-marketplace-dtk4q\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.373166 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6f9p\" (UniqueName: \"kubernetes.io/projected/face314d-1ec9-4abd-90a8-f2401f55160e-kube-api-access-z6f9p\") pod \"redhat-marketplace-dtk4q\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.484154 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:48 crc kubenswrapper[4786]: I0313 12:15:48.920605 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtk4q"] Mar 13 12:15:49 crc kubenswrapper[4786]: I0313 12:15:49.572152 4786 generic.go:334] "Generic (PLEG): container finished" podID="face314d-1ec9-4abd-90a8-f2401f55160e" containerID="faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd" exitCode=0 Mar 13 12:15:49 crc kubenswrapper[4786]: I0313 12:15:49.572406 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtk4q" event={"ID":"face314d-1ec9-4abd-90a8-f2401f55160e","Type":"ContainerDied","Data":"faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd"} Mar 13 12:15:49 crc kubenswrapper[4786]: I0313 12:15:49.572462 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtk4q" 
event={"ID":"face314d-1ec9-4abd-90a8-f2401f55160e","Type":"ContainerStarted","Data":"99c81319a221d65ff393c445dc21a682224978acd2b1f11b77e1f028d234b0b6"} Mar 13 12:15:50 crc kubenswrapper[4786]: I0313 12:15:50.583306 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtk4q" event={"ID":"face314d-1ec9-4abd-90a8-f2401f55160e","Type":"ContainerStarted","Data":"144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e"} Mar 13 12:15:51 crc kubenswrapper[4786]: I0313 12:15:51.597241 4786 generic.go:334] "Generic (PLEG): container finished" podID="face314d-1ec9-4abd-90a8-f2401f55160e" containerID="144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e" exitCode=0 Mar 13 12:15:51 crc kubenswrapper[4786]: I0313 12:15:51.597535 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtk4q" event={"ID":"face314d-1ec9-4abd-90a8-f2401f55160e","Type":"ContainerDied","Data":"144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e"} Mar 13 12:15:52 crc kubenswrapper[4786]: I0313 12:15:52.607453 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtk4q" event={"ID":"face314d-1ec9-4abd-90a8-f2401f55160e","Type":"ContainerStarted","Data":"39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055"} Mar 13 12:15:52 crc kubenswrapper[4786]: I0313 12:15:52.633825 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dtk4q" podStartSLOduration=2.130970541 podStartE2EDuration="4.633806449s" podCreationTimestamp="2026-03-13 12:15:48 +0000 UTC" firstStartedPulling="2026-03-13 12:15:49.575202388 +0000 UTC m=+1736.854855835" lastFinishedPulling="2026-03-13 12:15:52.078038296 +0000 UTC m=+1739.357691743" observedRunningTime="2026-03-13 12:15:52.627750164 +0000 UTC m=+1739.907403631" watchObservedRunningTime="2026-03-13 12:15:52.633806449 +0000 UTC 
m=+1739.913459906" Mar 13 12:15:57 crc kubenswrapper[4786]: I0313 12:15:57.441187 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:15:57 crc kubenswrapper[4786]: E0313 12:15:57.441939 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:15:58 crc kubenswrapper[4786]: I0313 12:15:58.484954 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:58 crc kubenswrapper[4786]: I0313 12:15:58.485107 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:58 crc kubenswrapper[4786]: I0313 12:15:58.562935 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:58 crc kubenswrapper[4786]: I0313 12:15:58.700124 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:15:58 crc kubenswrapper[4786]: I0313 12:15:58.805038 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtk4q"] Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.166352 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556736-dhfvw"] Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.167936 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-dhfvw" Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.171340 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.174051 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.174100 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.177379 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-dhfvw"] Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.225607 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwlq9\" (UniqueName: \"kubernetes.io/projected/78a324a1-864f-45b2-9932-058bed4ae9e2-kube-api-access-mwlq9\") pod \"auto-csr-approver-29556736-dhfvw\" (UID: \"78a324a1-864f-45b2-9932-058bed4ae9e2\") " pod="openshift-infra/auto-csr-approver-29556736-dhfvw" Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.327285 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwlq9\" (UniqueName: \"kubernetes.io/projected/78a324a1-864f-45b2-9932-058bed4ae9e2-kube-api-access-mwlq9\") pod \"auto-csr-approver-29556736-dhfvw\" (UID: \"78a324a1-864f-45b2-9932-058bed4ae9e2\") " pod="openshift-infra/auto-csr-approver-29556736-dhfvw" Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.360189 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwlq9\" (UniqueName: \"kubernetes.io/projected/78a324a1-864f-45b2-9932-058bed4ae9e2-kube-api-access-mwlq9\") pod \"auto-csr-approver-29556736-dhfvw\" (UID: \"78a324a1-864f-45b2-9932-058bed4ae9e2\") " 
pod="openshift-infra/auto-csr-approver-29556736-dhfvw" Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.488796 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-dhfvw" Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.673005 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dtk4q" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" containerName="registry-server" containerID="cri-o://39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055" gracePeriod=2 Mar 13 12:16:00 crc kubenswrapper[4786]: I0313 12:16:00.810633 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-dhfvw"] Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.629075 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.682855 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-dhfvw" event={"ID":"78a324a1-864f-45b2-9932-058bed4ae9e2","Type":"ContainerStarted","Data":"5bb836455b196d312aba727e5df0872d6ac2f8064cdb3b6703aebd0d7387efa3"} Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.684953 4786 generic.go:334] "Generic (PLEG): container finished" podID="face314d-1ec9-4abd-90a8-f2401f55160e" containerID="39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055" exitCode=0 Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.684993 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtk4q" event={"ID":"face314d-1ec9-4abd-90a8-f2401f55160e","Type":"ContainerDied","Data":"39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055"} Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.685026 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dtk4q" event={"ID":"face314d-1ec9-4abd-90a8-f2401f55160e","Type":"ContainerDied","Data":"99c81319a221d65ff393c445dc21a682224978acd2b1f11b77e1f028d234b0b6"} Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.685047 4786 scope.go:117] "RemoveContainer" containerID="39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.685047 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtk4q" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.712356 4786 scope.go:117] "RemoveContainer" containerID="144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.727703 4786 scope.go:117] "RemoveContainer" containerID="faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.749813 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6f9p\" (UniqueName: \"kubernetes.io/projected/face314d-1ec9-4abd-90a8-f2401f55160e-kube-api-access-z6f9p\") pod \"face314d-1ec9-4abd-90a8-f2401f55160e\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.749908 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-utilities\") pod \"face314d-1ec9-4abd-90a8-f2401f55160e\" (UID: \"face314d-1ec9-4abd-90a8-f2401f55160e\") " Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.749973 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-catalog-content\") pod \"face314d-1ec9-4abd-90a8-f2401f55160e\" (UID: 
\"face314d-1ec9-4abd-90a8-f2401f55160e\") " Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.750785 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-utilities" (OuterVolumeSpecName: "utilities") pod "face314d-1ec9-4abd-90a8-f2401f55160e" (UID: "face314d-1ec9-4abd-90a8-f2401f55160e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.754077 4786 scope.go:117] "RemoveContainer" containerID="39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055" Mar 13 12:16:01 crc kubenswrapper[4786]: E0313 12:16:01.754857 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055\": container with ID starting with 39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055 not found: ID does not exist" containerID="39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.755100 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055"} err="failed to get container status \"39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055\": rpc error: code = NotFound desc = could not find container \"39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055\": container with ID starting with 39f9e4524204015d8de482fcc4e70393d556f6b6fc90fb8536d89a547a23f055 not found: ID does not exist" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.755120 4786 scope.go:117] "RemoveContainer" containerID="144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e" Mar 13 12:16:01 crc kubenswrapper[4786]: E0313 12:16:01.755387 4786 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e\": container with ID starting with 144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e not found: ID does not exist" containerID="144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.755407 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e"} err="failed to get container status \"144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e\": rpc error: code = NotFound desc = could not find container \"144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e\": container with ID starting with 144c632ca7c4a0010d309d42a6b9d2918c7621f4f0ec7b2876c2ea601366f02e not found: ID does not exist" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.755420 4786 scope.go:117] "RemoveContainer" containerID="faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd" Mar 13 12:16:01 crc kubenswrapper[4786]: E0313 12:16:01.755616 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd\": container with ID starting with faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd not found: ID does not exist" containerID="faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.755654 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd"} err="failed to get container status \"faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd\": rpc error: code = NotFound desc = could not find container 
\"faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd\": container with ID starting with faf6d8fa0df116ed9180bb119a0565b6662e4172affd359104771ad4fef128dd not found: ID does not exist" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.756202 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/face314d-1ec9-4abd-90a8-f2401f55160e-kube-api-access-z6f9p" (OuterVolumeSpecName: "kube-api-access-z6f9p") pod "face314d-1ec9-4abd-90a8-f2401f55160e" (UID: "face314d-1ec9-4abd-90a8-f2401f55160e"). InnerVolumeSpecName "kube-api-access-z6f9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.779683 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "face314d-1ec9-4abd-90a8-f2401f55160e" (UID: "face314d-1ec9-4abd-90a8-f2401f55160e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.851559 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6f9p\" (UniqueName: \"kubernetes.io/projected/face314d-1ec9-4abd-90a8-f2401f55160e-kube-api-access-z6f9p\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.851596 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:01 crc kubenswrapper[4786]: I0313 12:16:01.851608 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/face314d-1ec9-4abd-90a8-f2401f55160e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:02 crc kubenswrapper[4786]: I0313 12:16:02.032719 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtk4q"] Mar 13 12:16:02 crc kubenswrapper[4786]: I0313 12:16:02.040225 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtk4q"] Mar 13 12:16:02 crc kubenswrapper[4786]: I0313 12:16:02.697416 4786 generic.go:334] "Generic (PLEG): container finished" podID="78a324a1-864f-45b2-9932-058bed4ae9e2" containerID="c2e095477616881d112fb30963985bde7f9d10c33ae01bde5de8706d3f8edf15" exitCode=0 Mar 13 12:16:02 crc kubenswrapper[4786]: I0313 12:16:02.697464 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-dhfvw" event={"ID":"78a324a1-864f-45b2-9932-058bed4ae9e2","Type":"ContainerDied","Data":"c2e095477616881d112fb30963985bde7f9d10c33ae01bde5de8706d3f8edf15"} Mar 13 12:16:03 crc kubenswrapper[4786]: I0313 12:16:03.453667 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" 
path="/var/lib/kubelet/pods/face314d-1ec9-4abd-90a8-f2401f55160e/volumes" Mar 13 12:16:03 crc kubenswrapper[4786]: I0313 12:16:03.945569 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-dhfvw" Mar 13 12:16:04 crc kubenswrapper[4786]: I0313 12:16:04.090796 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwlq9\" (UniqueName: \"kubernetes.io/projected/78a324a1-864f-45b2-9932-058bed4ae9e2-kube-api-access-mwlq9\") pod \"78a324a1-864f-45b2-9932-058bed4ae9e2\" (UID: \"78a324a1-864f-45b2-9932-058bed4ae9e2\") " Mar 13 12:16:04 crc kubenswrapper[4786]: I0313 12:16:04.096352 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a324a1-864f-45b2-9932-058bed4ae9e2-kube-api-access-mwlq9" (OuterVolumeSpecName: "kube-api-access-mwlq9") pod "78a324a1-864f-45b2-9932-058bed4ae9e2" (UID: "78a324a1-864f-45b2-9932-058bed4ae9e2"). InnerVolumeSpecName "kube-api-access-mwlq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:16:04 crc kubenswrapper[4786]: I0313 12:16:04.192685 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwlq9\" (UniqueName: \"kubernetes.io/projected/78a324a1-864f-45b2-9932-058bed4ae9e2-kube-api-access-mwlq9\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:04 crc kubenswrapper[4786]: I0313 12:16:04.720767 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-dhfvw" event={"ID":"78a324a1-864f-45b2-9932-058bed4ae9e2","Type":"ContainerDied","Data":"5bb836455b196d312aba727e5df0872d6ac2f8064cdb3b6703aebd0d7387efa3"} Mar 13 12:16:04 crc kubenswrapper[4786]: I0313 12:16:04.720822 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bb836455b196d312aba727e5df0872d6ac2f8064cdb3b6703aebd0d7387efa3" Mar 13 12:16:04 crc kubenswrapper[4786]: I0313 12:16:04.721161 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-dhfvw" Mar 13 12:16:05 crc kubenswrapper[4786]: I0313 12:16:05.010261 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-n78jr"] Mar 13 12:16:05 crc kubenswrapper[4786]: I0313 12:16:05.016728 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-n78jr"] Mar 13 12:16:05 crc kubenswrapper[4786]: I0313 12:16:05.452209 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855c715a-2a47-4dc6-ac8c-d5443ab2f0f9" path="/var/lib/kubelet/pods/855c715a-2a47-4dc6-ac8c-d5443ab2f0f9/volumes" Mar 13 12:16:12 crc kubenswrapper[4786]: I0313 12:16:12.440399 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:16:12 crc kubenswrapper[4786]: E0313 12:16:12.442370 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.530792 4786 scope.go:117] "RemoveContainer" containerID="766a2819d96407236e8a3bd4f525acafc200a2bab1d1ad0bf70c72c3c07ecc3c" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.577802 4786 scope.go:117] "RemoveContainer" containerID="0d7e4010821ab2b1ddac83a1076c1a1e388750a9cd4820f6725889bda766f5ea" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.626798 4786 scope.go:117] "RemoveContainer" containerID="fc27704e4bdbce3b44659c515a380b307831cf8584a32d829b419b58f811d1da" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.666132 4786 scope.go:117] "RemoveContainer" containerID="b4a02fc023117f09e3d3e5a7efa55580e7dde6907de4ab87c55e391577ab23a0" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.687373 4786 scope.go:117] "RemoveContainer" containerID="7da91ebeef699e9de315563c6fce678e89e604873a19738404bc1764f949d732" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.717757 4786 scope.go:117] "RemoveContainer" containerID="87c9337e5f8be7831921a2cc00598ccb8555b8faf4b6f237f998d7e8ccce7644" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.743999 4786 scope.go:117] "RemoveContainer" containerID="3ffd782cc85ee75660750d9e43f93d793c70c06599115717af65c7e77938cc2e" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.785794 4786 scope.go:117] "RemoveContainer" containerID="2f45e2aaf79647e376432ad16e19542a71767496aecdffdb4d6cc63e555f3298" Mar 13 12:16:15 crc kubenswrapper[4786]: I0313 12:16:15.826065 4786 scope.go:117] "RemoveContainer" containerID="ae16e2216939862263bfe245efece6c23823d38bfbc785950f78f2415d0c22ac" Mar 13 12:16:26 crc 
kubenswrapper[4786]: I0313 12:16:26.440865 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:16:26 crc kubenswrapper[4786]: E0313 12:16:26.441947 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:16:37 crc kubenswrapper[4786]: I0313 12:16:37.440400 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:16:37 crc kubenswrapper[4786]: E0313 12:16:37.441209 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:16:52 crc kubenswrapper[4786]: I0313 12:16:52.441138 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:16:52 crc kubenswrapper[4786]: E0313 12:16:52.442328 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 
13 12:17:06 crc kubenswrapper[4786]: I0313 12:17:06.441063 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:17:06 crc kubenswrapper[4786]: E0313 12:17:06.442104 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:17:15 crc kubenswrapper[4786]: I0313 12:17:15.997311 4786 scope.go:117] "RemoveContainer" containerID="7d62ff3e67b7bcfe3dfbaba001479ada67c0df7d19973a69f85cee8431f830b1" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.027691 4786 scope.go:117] "RemoveContainer" containerID="43af16d54d57e48978f9e8c2ddeef30020c41ef69f7a4f2ae087170c105f9dc4" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.062559 4786 scope.go:117] "RemoveContainer" containerID="a4508a806bab3c42762b33d4b026a3718e0e71d0df4154080451f18edf45ef82" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.096940 4786 scope.go:117] "RemoveContainer" containerID="8d6a8cb1e53e6f557d6e7eb5500d9064d12283cd86a24ee5003fc06a11a56be3" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.127890 4786 scope.go:117] "RemoveContainer" containerID="493eb4efaa56ff53a7930283def29db5ea69112fa0c04f9828e3b5e04a2fc1b9" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.149935 4786 scope.go:117] "RemoveContainer" containerID="6faefcc8f8f08a959c2efe031b6171c7238793dbc61001559ea27795b9e169c2" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.169319 4786 scope.go:117] "RemoveContainer" containerID="bd18ddf3196c9bf5a4a9b83508e122c58fe4f289252ce5dccbca45f28bb401b8" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.184335 4786 
scope.go:117] "RemoveContainer" containerID="194f868d7b6d5ddac02b8dc95f77ded38e44b17bfc81fa80fb0e1e9b550a6775" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.200712 4786 scope.go:117] "RemoveContainer" containerID="7d26ec1d9ece2a37cced85d6beca1eecc881de15084de4e7ac289a248517ef2c" Mar 13 12:17:16 crc kubenswrapper[4786]: I0313 12:17:16.240488 4786 scope.go:117] "RemoveContainer" containerID="8f4a77d63c9b8e9d37ac53c4d16ee3f4eecc62fce4ba87cde9159abeab1c14a2" Mar 13 12:17:18 crc kubenswrapper[4786]: I0313 12:17:18.441202 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:17:18 crc kubenswrapper[4786]: E0313 12:17:18.441829 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:17:31 crc kubenswrapper[4786]: I0313 12:17:31.440289 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:17:31 crc kubenswrapper[4786]: E0313 12:17:31.441109 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:17:43 crc kubenswrapper[4786]: I0313 12:17:43.444476 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 
12:17:43 crc kubenswrapper[4786]: E0313 12:17:43.445319 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:17:57 crc kubenswrapper[4786]: I0313 12:17:57.441387 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:17:57 crc kubenswrapper[4786]: E0313 12:17:57.442294 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.141247 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556738-m5xp8"] Mar 13 12:18:00 crc kubenswrapper[4786]: E0313 12:18:00.141939 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a324a1-864f-45b2-9932-058bed4ae9e2" containerName="oc" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.141956 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a324a1-864f-45b2-9932-058bed4ae9e2" containerName="oc" Mar 13 12:18:00 crc kubenswrapper[4786]: E0313 12:18:00.141984 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" containerName="extract-content" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.141995 4786 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" containerName="extract-content" Mar 13 12:18:00 crc kubenswrapper[4786]: E0313 12:18:00.142010 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" containerName="registry-server" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.142018 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" containerName="registry-server" Mar 13 12:18:00 crc kubenswrapper[4786]: E0313 12:18:00.142035 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" containerName="extract-utilities" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.142042 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" containerName="extract-utilities" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.142192 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a324a1-864f-45b2-9932-058bed4ae9e2" containerName="oc" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.142210 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="face314d-1ec9-4abd-90a8-f2401f55160e" containerName="registry-server" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.142721 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-m5xp8" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.144218 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.145135 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.152204 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.159181 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-m5xp8"] Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.243820 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptrtk\" (UniqueName: \"kubernetes.io/projected/f6296ace-d6a9-45e3-81ac-20c708bc3588-kube-api-access-ptrtk\") pod \"auto-csr-approver-29556738-m5xp8\" (UID: \"f6296ace-d6a9-45e3-81ac-20c708bc3588\") " pod="openshift-infra/auto-csr-approver-29556738-m5xp8" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.344833 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptrtk\" (UniqueName: \"kubernetes.io/projected/f6296ace-d6a9-45e3-81ac-20c708bc3588-kube-api-access-ptrtk\") pod \"auto-csr-approver-29556738-m5xp8\" (UID: \"f6296ace-d6a9-45e3-81ac-20c708bc3588\") " pod="openshift-infra/auto-csr-approver-29556738-m5xp8" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.361351 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptrtk\" (UniqueName: \"kubernetes.io/projected/f6296ace-d6a9-45e3-81ac-20c708bc3588-kube-api-access-ptrtk\") pod \"auto-csr-approver-29556738-m5xp8\" (UID: \"f6296ace-d6a9-45e3-81ac-20c708bc3588\") " 
pod="openshift-infra/auto-csr-approver-29556738-m5xp8" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.460332 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-m5xp8" Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.884094 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-m5xp8"] Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.903013 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:18:00 crc kubenswrapper[4786]: I0313 12:18:00.944454 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-m5xp8" event={"ID":"f6296ace-d6a9-45e3-81ac-20c708bc3588","Type":"ContainerStarted","Data":"7a866ac36e5fd8b77aa21768359a7fb6e34d72af85dda0d8e6e54e61e1572baa"} Mar 13 12:18:02 crc kubenswrapper[4786]: I0313 12:18:02.957129 4786 generic.go:334] "Generic (PLEG): container finished" podID="f6296ace-d6a9-45e3-81ac-20c708bc3588" containerID="98e46c3b3f266db9d9db2ab4bf3b3572a4b24c93091763d0c4c6e10669d44227" exitCode=0 Mar 13 12:18:02 crc kubenswrapper[4786]: I0313 12:18:02.957183 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-m5xp8" event={"ID":"f6296ace-d6a9-45e3-81ac-20c708bc3588","Type":"ContainerDied","Data":"98e46c3b3f266db9d9db2ab4bf3b3572a4b24c93091763d0c4c6e10669d44227"} Mar 13 12:18:04 crc kubenswrapper[4786]: I0313 12:18:04.250296 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-m5xp8" Mar 13 12:18:04 crc kubenswrapper[4786]: I0313 12:18:04.302190 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptrtk\" (UniqueName: \"kubernetes.io/projected/f6296ace-d6a9-45e3-81ac-20c708bc3588-kube-api-access-ptrtk\") pod \"f6296ace-d6a9-45e3-81ac-20c708bc3588\" (UID: \"f6296ace-d6a9-45e3-81ac-20c708bc3588\") " Mar 13 12:18:04 crc kubenswrapper[4786]: I0313 12:18:04.308345 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6296ace-d6a9-45e3-81ac-20c708bc3588-kube-api-access-ptrtk" (OuterVolumeSpecName: "kube-api-access-ptrtk") pod "f6296ace-d6a9-45e3-81ac-20c708bc3588" (UID: "f6296ace-d6a9-45e3-81ac-20c708bc3588"). InnerVolumeSpecName "kube-api-access-ptrtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:04 crc kubenswrapper[4786]: I0313 12:18:04.403243 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptrtk\" (UniqueName: \"kubernetes.io/projected/f6296ace-d6a9-45e3-81ac-20c708bc3588-kube-api-access-ptrtk\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:04 crc kubenswrapper[4786]: I0313 12:18:04.974275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-m5xp8" event={"ID":"f6296ace-d6a9-45e3-81ac-20c708bc3588","Type":"ContainerDied","Data":"7a866ac36e5fd8b77aa21768359a7fb6e34d72af85dda0d8e6e54e61e1572baa"} Mar 13 12:18:04 crc kubenswrapper[4786]: I0313 12:18:04.974323 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a866ac36e5fd8b77aa21768359a7fb6e34d72af85dda0d8e6e54e61e1572baa" Mar 13 12:18:04 crc kubenswrapper[4786]: I0313 12:18:04.974379 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-m5xp8" Mar 13 12:18:05 crc kubenswrapper[4786]: I0313 12:18:05.316260 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-fkzcg"] Mar 13 12:18:05 crc kubenswrapper[4786]: I0313 12:18:05.324281 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-fkzcg"] Mar 13 12:18:05 crc kubenswrapper[4786]: I0313 12:18:05.448916 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6e83607-3ddd-4f8f-885d-b723affa2133" path="/var/lib/kubelet/pods/f6e83607-3ddd-4f8f-885d-b723affa2133/volumes" Mar 13 12:18:09 crc kubenswrapper[4786]: I0313 12:18:09.440786 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:18:09 crc kubenswrapper[4786]: E0313 12:18:09.442103 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:18:10 crc kubenswrapper[4786]: I0313 12:18:10.926521 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5m72s"] Mar 13 12:18:10 crc kubenswrapper[4786]: E0313 12:18:10.926895 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6296ace-d6a9-45e3-81ac-20c708bc3588" containerName="oc" Mar 13 12:18:10 crc kubenswrapper[4786]: I0313 12:18:10.926911 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6296ace-d6a9-45e3-81ac-20c708bc3588" containerName="oc" Mar 13 12:18:10 crc kubenswrapper[4786]: I0313 12:18:10.927095 4786 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f6296ace-d6a9-45e3-81ac-20c708bc3588" containerName="oc" Mar 13 12:18:10 crc kubenswrapper[4786]: I0313 12:18:10.928250 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:10 crc kubenswrapper[4786]: I0313 12:18:10.936601 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5m72s"] Mar 13 12:18:10 crc kubenswrapper[4786]: I0313 12:18:10.996855 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-utilities\") pod \"certified-operators-5m72s\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:10 crc kubenswrapper[4786]: I0313 12:18:10.997507 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-catalog-content\") pod \"certified-operators-5m72s\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:10 crc kubenswrapper[4786]: I0313 12:18:10.997604 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmzg6\" (UniqueName: \"kubernetes.io/projected/d6c959b7-9559-41f8-8100-bd25228919cb-kube-api-access-rmzg6\") pod \"certified-operators-5m72s\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:11 crc kubenswrapper[4786]: I0313 12:18:11.098378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-catalog-content\") pod \"certified-operators-5m72s\" (UID: 
\"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:11 crc kubenswrapper[4786]: I0313 12:18:11.098422 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmzg6\" (UniqueName: \"kubernetes.io/projected/d6c959b7-9559-41f8-8100-bd25228919cb-kube-api-access-rmzg6\") pod \"certified-operators-5m72s\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:11 crc kubenswrapper[4786]: I0313 12:18:11.098486 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-utilities\") pod \"certified-operators-5m72s\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:11 crc kubenswrapper[4786]: I0313 12:18:11.098937 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-utilities\") pod \"certified-operators-5m72s\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:11 crc kubenswrapper[4786]: I0313 12:18:11.099159 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-catalog-content\") pod \"certified-operators-5m72s\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:11 crc kubenswrapper[4786]: I0313 12:18:11.139597 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmzg6\" (UniqueName: \"kubernetes.io/projected/d6c959b7-9559-41f8-8100-bd25228919cb-kube-api-access-rmzg6\") pod \"certified-operators-5m72s\" (UID: 
\"d6c959b7-9559-41f8-8100-bd25228919cb\") " pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:11 crc kubenswrapper[4786]: I0313 12:18:11.263140 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:11 crc kubenswrapper[4786]: I0313 12:18:11.726942 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5m72s"] Mar 13 12:18:12 crc kubenswrapper[4786]: I0313 12:18:12.025159 4786 generic.go:334] "Generic (PLEG): container finished" podID="d6c959b7-9559-41f8-8100-bd25228919cb" containerID="c5b11412a3e319e6b0f400fabfc9c1658eb318e4afc49331a5dded4f28ebc208" exitCode=0 Mar 13 12:18:12 crc kubenswrapper[4786]: I0313 12:18:12.025208 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5m72s" event={"ID":"d6c959b7-9559-41f8-8100-bd25228919cb","Type":"ContainerDied","Data":"c5b11412a3e319e6b0f400fabfc9c1658eb318e4afc49331a5dded4f28ebc208"} Mar 13 12:18:12 crc kubenswrapper[4786]: I0313 12:18:12.025246 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5m72s" event={"ID":"d6c959b7-9559-41f8-8100-bd25228919cb","Type":"ContainerStarted","Data":"1945ec75312a9d91580979c49f7103b5c7af89727569fb83d50e59121c0f6e6f"} Mar 13 12:18:13 crc kubenswrapper[4786]: I0313 12:18:13.033951 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5m72s" event={"ID":"d6c959b7-9559-41f8-8100-bd25228919cb","Type":"ContainerStarted","Data":"ebb35dfaf62e178aedaabb6674f9c4d4f9f977a2d3905f341c1a555094c19156"} Mar 13 12:18:14 crc kubenswrapper[4786]: I0313 12:18:14.043844 4786 generic.go:334] "Generic (PLEG): container finished" podID="d6c959b7-9559-41f8-8100-bd25228919cb" containerID="ebb35dfaf62e178aedaabb6674f9c4d4f9f977a2d3905f341c1a555094c19156" exitCode=0 Mar 13 12:18:14 crc kubenswrapper[4786]: I0313 
12:18:14.043944 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5m72s" event={"ID":"d6c959b7-9559-41f8-8100-bd25228919cb","Type":"ContainerDied","Data":"ebb35dfaf62e178aedaabb6674f9c4d4f9f977a2d3905f341c1a555094c19156"} Mar 13 12:18:15 crc kubenswrapper[4786]: I0313 12:18:15.052679 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5m72s" event={"ID":"d6c959b7-9559-41f8-8100-bd25228919cb","Type":"ContainerStarted","Data":"3ac1e95d8e7ffa28ad84e44be51ce2c73986fb9dc513972f216acd7c212a08f0"} Mar 13 12:18:15 crc kubenswrapper[4786]: I0313 12:18:15.076235 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5m72s" podStartSLOduration=2.619897726 podStartE2EDuration="5.076213303s" podCreationTimestamp="2026-03-13 12:18:10 +0000 UTC" firstStartedPulling="2026-03-13 12:18:12.02671596 +0000 UTC m=+1879.306369407" lastFinishedPulling="2026-03-13 12:18:14.483031537 +0000 UTC m=+1881.762684984" observedRunningTime="2026-03-13 12:18:15.071007672 +0000 UTC m=+1882.350661119" watchObservedRunningTime="2026-03-13 12:18:15.076213303 +0000 UTC m=+1882.355866750" Mar 13 12:18:16 crc kubenswrapper[4786]: I0313 12:18:16.380070 4786 scope.go:117] "RemoveContainer" containerID="ef263589f071e63e4efd997f22657bcddc785401bde3484b7cdc186235176ae8" Mar 13 12:18:16 crc kubenswrapper[4786]: I0313 12:18:16.404315 4786 scope.go:117] "RemoveContainer" containerID="65837bac0546f59350c540914448df362ce0c0ef074546abc8ea87fee95af8a5" Mar 13 12:18:16 crc kubenswrapper[4786]: I0313 12:18:16.452068 4786 scope.go:117] "RemoveContainer" containerID="e9fc0d6300970d7f8d6d459a014308012ae54ecd24efc235ec0affdfe4f8a2a3" Mar 13 12:18:16 crc kubenswrapper[4786]: I0313 12:18:16.488704 4786 scope.go:117] "RemoveContainer" containerID="c4439fbf918b3a73b489027b0f4677a7f89a0a3677dcff0e1750fe03ad5bddb6" Mar 13 12:18:16 crc kubenswrapper[4786]: I0313 
12:18:16.526724 4786 scope.go:117] "RemoveContainer" containerID="6f64281da4aa8cf483b737b3b0a3c14552d8ab3d54b5e412e6af976479c43180" Mar 13 12:18:21 crc kubenswrapper[4786]: I0313 12:18:21.263913 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:21 crc kubenswrapper[4786]: I0313 12:18:21.264301 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:21 crc kubenswrapper[4786]: I0313 12:18:21.309511 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:21 crc kubenswrapper[4786]: I0313 12:18:21.441228 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:18:21 crc kubenswrapper[4786]: E0313 12:18:21.441721 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:18:22 crc kubenswrapper[4786]: I0313 12:18:22.147634 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:24 crc kubenswrapper[4786]: I0313 12:18:24.722325 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5m72s"] Mar 13 12:18:24 crc kubenswrapper[4786]: I0313 12:18:24.722690 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5m72s" podUID="d6c959b7-9559-41f8-8100-bd25228919cb" 
containerName="registry-server" containerID="cri-o://3ac1e95d8e7ffa28ad84e44be51ce2c73986fb9dc513972f216acd7c212a08f0" gracePeriod=2 Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.126279 4786 generic.go:334] "Generic (PLEG): container finished" podID="d6c959b7-9559-41f8-8100-bd25228919cb" containerID="3ac1e95d8e7ffa28ad84e44be51ce2c73986fb9dc513972f216acd7c212a08f0" exitCode=0 Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.126363 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5m72s" event={"ID":"d6c959b7-9559-41f8-8100-bd25228919cb","Type":"ContainerDied","Data":"3ac1e95d8e7ffa28ad84e44be51ce2c73986fb9dc513972f216acd7c212a08f0"} Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.224493 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.319055 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-catalog-content\") pod \"d6c959b7-9559-41f8-8100-bd25228919cb\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.319121 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-utilities\") pod \"d6c959b7-9559-41f8-8100-bd25228919cb\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.319230 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmzg6\" (UniqueName: \"kubernetes.io/projected/d6c959b7-9559-41f8-8100-bd25228919cb-kube-api-access-rmzg6\") pod \"d6c959b7-9559-41f8-8100-bd25228919cb\" (UID: \"d6c959b7-9559-41f8-8100-bd25228919cb\") " Mar 13 12:18:25 crc 
kubenswrapper[4786]: I0313 12:18:25.320315 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-utilities" (OuterVolumeSpecName: "utilities") pod "d6c959b7-9559-41f8-8100-bd25228919cb" (UID: "d6c959b7-9559-41f8-8100-bd25228919cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.329221 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c959b7-9559-41f8-8100-bd25228919cb-kube-api-access-rmzg6" (OuterVolumeSpecName: "kube-api-access-rmzg6") pod "d6c959b7-9559-41f8-8100-bd25228919cb" (UID: "d6c959b7-9559-41f8-8100-bd25228919cb"). InnerVolumeSpecName "kube-api-access-rmzg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.379161 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6c959b7-9559-41f8-8100-bd25228919cb" (UID: "d6c959b7-9559-41f8-8100-bd25228919cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.420800 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.420831 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6c959b7-9559-41f8-8100-bd25228919cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:25 crc kubenswrapper[4786]: I0313 12:18:25.420840 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmzg6\" (UniqueName: \"kubernetes.io/projected/d6c959b7-9559-41f8-8100-bd25228919cb-kube-api-access-rmzg6\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:26 crc kubenswrapper[4786]: I0313 12:18:26.134992 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5m72s" event={"ID":"d6c959b7-9559-41f8-8100-bd25228919cb","Type":"ContainerDied","Data":"1945ec75312a9d91580979c49f7103b5c7af89727569fb83d50e59121c0f6e6f"} Mar 13 12:18:26 crc kubenswrapper[4786]: I0313 12:18:26.135058 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5m72s" Mar 13 12:18:26 crc kubenswrapper[4786]: I0313 12:18:26.135416 4786 scope.go:117] "RemoveContainer" containerID="3ac1e95d8e7ffa28ad84e44be51ce2c73986fb9dc513972f216acd7c212a08f0" Mar 13 12:18:26 crc kubenswrapper[4786]: I0313 12:18:26.165597 4786 scope.go:117] "RemoveContainer" containerID="ebb35dfaf62e178aedaabb6674f9c4d4f9f977a2d3905f341c1a555094c19156" Mar 13 12:18:26 crc kubenswrapper[4786]: I0313 12:18:26.166675 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5m72s"] Mar 13 12:18:26 crc kubenswrapper[4786]: I0313 12:18:26.173141 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5m72s"] Mar 13 12:18:26 crc kubenswrapper[4786]: I0313 12:18:26.186151 4786 scope.go:117] "RemoveContainer" containerID="c5b11412a3e319e6b0f400fabfc9c1658eb318e4afc49331a5dded4f28ebc208" Mar 13 12:18:27 crc kubenswrapper[4786]: I0313 12:18:27.450592 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c959b7-9559-41f8-8100-bd25228919cb" path="/var/lib/kubelet/pods/d6c959b7-9559-41f8-8100-bd25228919cb/volumes" Mar 13 12:18:35 crc kubenswrapper[4786]: I0313 12:18:35.440480 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:18:35 crc kubenswrapper[4786]: E0313 12:18:35.441273 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:18:48 crc kubenswrapper[4786]: I0313 12:18:48.441256 4786 scope.go:117] "RemoveContainer" 
containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:18:48 crc kubenswrapper[4786]: E0313 12:18:48.445146 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:19:02 crc kubenswrapper[4786]: I0313 12:19:02.441969 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:19:02 crc kubenswrapper[4786]: E0313 12:19:02.442781 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:19:14 crc kubenswrapper[4786]: I0313 12:19:14.441274 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:19:14 crc kubenswrapper[4786]: E0313 12:19:14.442262 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:19:16 crc kubenswrapper[4786]: I0313 12:19:16.588116 4786 scope.go:117] 
"RemoveContainer" containerID="c90607a25b6719b906805f4956767bf9e8f2062f95bfc51dac8f6059d27ae384" Mar 13 12:19:16 crc kubenswrapper[4786]: I0313 12:19:16.623220 4786 scope.go:117] "RemoveContainer" containerID="1b0b367e7cd0a1267707201fcc6eb17e95461077f6d3e9b86822b55b231ea0c0" Mar 13 12:19:16 crc kubenswrapper[4786]: I0313 12:19:16.647514 4786 scope.go:117] "RemoveContainer" containerID="6ca4ed1353fe2122e66b7cdc238326a51066ac9f00f84fe43e52d17f553e850a" Mar 13 12:19:16 crc kubenswrapper[4786]: I0313 12:19:16.673509 4786 scope.go:117] "RemoveContainer" containerID="19ed6f38037a55c43058db0a67693dffe38372d306c408426bb30752659582c5" Mar 13 12:19:16 crc kubenswrapper[4786]: I0313 12:19:16.702220 4786 scope.go:117] "RemoveContainer" containerID="9aba29aa33189388ae65aa54d679b66d924f38fcf6996f9e7909443730f648f3" Mar 13 12:19:26 crc kubenswrapper[4786]: I0313 12:19:26.440773 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:19:26 crc kubenswrapper[4786]: E0313 12:19:26.441979 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:19:40 crc kubenswrapper[4786]: I0313 12:19:40.440606 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:19:40 crc kubenswrapper[4786]: E0313 12:19:40.441457 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:19:53 crc kubenswrapper[4786]: I0313 12:19:53.445918 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:19:53 crc kubenswrapper[4786]: E0313 12:19:53.446581 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.147425 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556740-mwm65"] Mar 13 12:20:00 crc kubenswrapper[4786]: E0313 12:20:00.148285 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c959b7-9559-41f8-8100-bd25228919cb" containerName="registry-server" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.148300 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c959b7-9559-41f8-8100-bd25228919cb" containerName="registry-server" Mar 13 12:20:00 crc kubenswrapper[4786]: E0313 12:20:00.148316 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c959b7-9559-41f8-8100-bd25228919cb" containerName="extract-content" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.148326 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c959b7-9559-41f8-8100-bd25228919cb" containerName="extract-content" Mar 13 12:20:00 crc kubenswrapper[4786]: E0313 12:20:00.148336 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6c959b7-9559-41f8-8100-bd25228919cb" containerName="extract-utilities" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.148344 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c959b7-9559-41f8-8100-bd25228919cb" containerName="extract-utilities" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.148537 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c959b7-9559-41f8-8100-bd25228919cb" containerName="registry-server" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.149135 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-mwm65" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.152554 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.152840 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.153065 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.155109 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-mwm65"] Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.257662 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tqt\" (UniqueName: \"kubernetes.io/projected/1650bdb1-4fde-4823-96eb-5f2b8a273eba-kube-api-access-k7tqt\") pod \"auto-csr-approver-29556740-mwm65\" (UID: \"1650bdb1-4fde-4823-96eb-5f2b8a273eba\") " pod="openshift-infra/auto-csr-approver-29556740-mwm65" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.358704 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tqt\" (UniqueName: 
\"kubernetes.io/projected/1650bdb1-4fde-4823-96eb-5f2b8a273eba-kube-api-access-k7tqt\") pod \"auto-csr-approver-29556740-mwm65\" (UID: \"1650bdb1-4fde-4823-96eb-5f2b8a273eba\") " pod="openshift-infra/auto-csr-approver-29556740-mwm65" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.379056 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tqt\" (UniqueName: \"kubernetes.io/projected/1650bdb1-4fde-4823-96eb-5f2b8a273eba-kube-api-access-k7tqt\") pod \"auto-csr-approver-29556740-mwm65\" (UID: \"1650bdb1-4fde-4823-96eb-5f2b8a273eba\") " pod="openshift-infra/auto-csr-approver-29556740-mwm65" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.492765 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-mwm65" Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.765854 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-mwm65"] Mar 13 12:20:00 crc kubenswrapper[4786]: I0313 12:20:00.914356 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-mwm65" event={"ID":"1650bdb1-4fde-4823-96eb-5f2b8a273eba","Type":"ContainerStarted","Data":"c56a7c4a9906bcb010d615487944be9ec13f818348507289e38780f549e303f5"} Mar 13 12:20:02 crc kubenswrapper[4786]: I0313 12:20:02.930167 4786 generic.go:334] "Generic (PLEG): container finished" podID="1650bdb1-4fde-4823-96eb-5f2b8a273eba" containerID="e06be280fff84559113fc0537f29bfbb88391f35f68a8540cbcb0a93adbeb97b" exitCode=0 Mar 13 12:20:02 crc kubenswrapper[4786]: I0313 12:20:02.930240 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-mwm65" event={"ID":"1650bdb1-4fde-4823-96eb-5f2b8a273eba","Type":"ContainerDied","Data":"e06be280fff84559113fc0537f29bfbb88391f35f68a8540cbcb0a93adbeb97b"} Mar 13 12:20:04 crc kubenswrapper[4786]: I0313 12:20:04.208656 4786 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-mwm65" Mar 13 12:20:04 crc kubenswrapper[4786]: I0313 12:20:04.328293 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7tqt\" (UniqueName: \"kubernetes.io/projected/1650bdb1-4fde-4823-96eb-5f2b8a273eba-kube-api-access-k7tqt\") pod \"1650bdb1-4fde-4823-96eb-5f2b8a273eba\" (UID: \"1650bdb1-4fde-4823-96eb-5f2b8a273eba\") " Mar 13 12:20:04 crc kubenswrapper[4786]: I0313 12:20:04.334231 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1650bdb1-4fde-4823-96eb-5f2b8a273eba-kube-api-access-k7tqt" (OuterVolumeSpecName: "kube-api-access-k7tqt") pod "1650bdb1-4fde-4823-96eb-5f2b8a273eba" (UID: "1650bdb1-4fde-4823-96eb-5f2b8a273eba"). InnerVolumeSpecName "kube-api-access-k7tqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:20:04 crc kubenswrapper[4786]: I0313 12:20:04.429956 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7tqt\" (UniqueName: \"kubernetes.io/projected/1650bdb1-4fde-4823-96eb-5f2b8a273eba-kube-api-access-k7tqt\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:04 crc kubenswrapper[4786]: I0313 12:20:04.947855 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-mwm65" event={"ID":"1650bdb1-4fde-4823-96eb-5f2b8a273eba","Type":"ContainerDied","Data":"c56a7c4a9906bcb010d615487944be9ec13f818348507289e38780f549e303f5"} Mar 13 12:20:04 crc kubenswrapper[4786]: I0313 12:20:04.948103 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56a7c4a9906bcb010d615487944be9ec13f818348507289e38780f549e303f5" Mar 13 12:20:04 crc kubenswrapper[4786]: I0313 12:20:04.947978 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-mwm65" Mar 13 12:20:05 crc kubenswrapper[4786]: I0313 12:20:05.282607 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-k9nqw"] Mar 13 12:20:05 crc kubenswrapper[4786]: I0313 12:20:05.287367 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-k9nqw"] Mar 13 12:20:05 crc kubenswrapper[4786]: I0313 12:20:05.457577 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be27b311-0b0d-4861-85e4-48166da8614c" path="/var/lib/kubelet/pods/be27b311-0b0d-4861-85e4-48166da8614c/volumes" Mar 13 12:20:07 crc kubenswrapper[4786]: I0313 12:20:07.440754 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:20:07 crc kubenswrapper[4786]: E0313 12:20:07.441335 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:20:16 crc kubenswrapper[4786]: I0313 12:20:16.813626 4786 scope.go:117] "RemoveContainer" containerID="06f6ba5eb3ee12cc3786bb0d68962fbb56f0f50aa7c77330b87079d6da1c0f94" Mar 13 12:20:19 crc kubenswrapper[4786]: I0313 12:20:19.440847 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:20:20 crc kubenswrapper[4786]: I0313 12:20:20.095457 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" 
event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"107ba765aad702f8559e4685fe2234f75237757d40ac2a4a7a7cceb570b17bf1"} Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.150704 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556742-bvpzq"] Mar 13 12:22:00 crc kubenswrapper[4786]: E0313 12:22:00.151544 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1650bdb1-4fde-4823-96eb-5f2b8a273eba" containerName="oc" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.151557 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1650bdb1-4fde-4823-96eb-5f2b8a273eba" containerName="oc" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.151719 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1650bdb1-4fde-4823-96eb-5f2b8a273eba" containerName="oc" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.152204 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-bvpzq" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.154783 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.154841 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.154985 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.158017 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-bvpzq"] Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.268107 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pwmv\" (UniqueName: 
\"kubernetes.io/projected/d540f2b8-0d3d-49a8-a69f-20f831a527de-kube-api-access-4pwmv\") pod \"auto-csr-approver-29556742-bvpzq\" (UID: \"d540f2b8-0d3d-49a8-a69f-20f831a527de\") " pod="openshift-infra/auto-csr-approver-29556742-bvpzq" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.370118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pwmv\" (UniqueName: \"kubernetes.io/projected/d540f2b8-0d3d-49a8-a69f-20f831a527de-kube-api-access-4pwmv\") pod \"auto-csr-approver-29556742-bvpzq\" (UID: \"d540f2b8-0d3d-49a8-a69f-20f831a527de\") " pod="openshift-infra/auto-csr-approver-29556742-bvpzq" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.387901 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pwmv\" (UniqueName: \"kubernetes.io/projected/d540f2b8-0d3d-49a8-a69f-20f831a527de-kube-api-access-4pwmv\") pod \"auto-csr-approver-29556742-bvpzq\" (UID: \"d540f2b8-0d3d-49a8-a69f-20f831a527de\") " pod="openshift-infra/auto-csr-approver-29556742-bvpzq" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.469224 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-bvpzq" Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.690950 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-bvpzq"] Mar 13 12:22:00 crc kubenswrapper[4786]: I0313 12:22:00.889829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-bvpzq" event={"ID":"d540f2b8-0d3d-49a8-a69f-20f831a527de","Type":"ContainerStarted","Data":"18e4e2b7aa2ef2f3213afeb3c79267783b62b6e3d39a8bc14b93b26cfb013729"} Mar 13 12:22:02 crc kubenswrapper[4786]: I0313 12:22:02.905985 4786 generic.go:334] "Generic (PLEG): container finished" podID="d540f2b8-0d3d-49a8-a69f-20f831a527de" containerID="a4ff15a12184a54374f01e28398c0774a3ee809e272517aaa658f6c7de97382e" exitCode=0 Mar 13 12:22:02 crc kubenswrapper[4786]: I0313 12:22:02.906200 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-bvpzq" event={"ID":"d540f2b8-0d3d-49a8-a69f-20f831a527de","Type":"ContainerDied","Data":"a4ff15a12184a54374f01e28398c0774a3ee809e272517aaa658f6c7de97382e"} Mar 13 12:22:04 crc kubenswrapper[4786]: I0313 12:22:04.181643 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-bvpzq" Mar 13 12:22:04 crc kubenswrapper[4786]: I0313 12:22:04.331395 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pwmv\" (UniqueName: \"kubernetes.io/projected/d540f2b8-0d3d-49a8-a69f-20f831a527de-kube-api-access-4pwmv\") pod \"d540f2b8-0d3d-49a8-a69f-20f831a527de\" (UID: \"d540f2b8-0d3d-49a8-a69f-20f831a527de\") " Mar 13 12:22:04 crc kubenswrapper[4786]: I0313 12:22:04.340075 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d540f2b8-0d3d-49a8-a69f-20f831a527de-kube-api-access-4pwmv" (OuterVolumeSpecName: "kube-api-access-4pwmv") pod "d540f2b8-0d3d-49a8-a69f-20f831a527de" (UID: "d540f2b8-0d3d-49a8-a69f-20f831a527de"). InnerVolumeSpecName "kube-api-access-4pwmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:22:04 crc kubenswrapper[4786]: I0313 12:22:04.433766 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pwmv\" (UniqueName: \"kubernetes.io/projected/d540f2b8-0d3d-49a8-a69f-20f831a527de-kube-api-access-4pwmv\") on node \"crc\" DevicePath \"\"" Mar 13 12:22:04 crc kubenswrapper[4786]: I0313 12:22:04.940178 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-bvpzq" event={"ID":"d540f2b8-0d3d-49a8-a69f-20f831a527de","Type":"ContainerDied","Data":"18e4e2b7aa2ef2f3213afeb3c79267783b62b6e3d39a8bc14b93b26cfb013729"} Mar 13 12:22:04 crc kubenswrapper[4786]: I0313 12:22:04.940230 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18e4e2b7aa2ef2f3213afeb3c79267783b62b6e3d39a8bc14b93b26cfb013729" Mar 13 12:22:04 crc kubenswrapper[4786]: I0313 12:22:04.940332 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-bvpzq" Mar 13 12:22:05 crc kubenswrapper[4786]: I0313 12:22:05.261576 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-dhfvw"] Mar 13 12:22:05 crc kubenswrapper[4786]: I0313 12:22:05.270703 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-dhfvw"] Mar 13 12:22:05 crc kubenswrapper[4786]: I0313 12:22:05.448861 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a324a1-864f-45b2-9932-058bed4ae9e2" path="/var/lib/kubelet/pods/78a324a1-864f-45b2-9932-058bed4ae9e2/volumes" Mar 13 12:22:16 crc kubenswrapper[4786]: I0313 12:22:16.896282 4786 scope.go:117] "RemoveContainer" containerID="c2e095477616881d112fb30963985bde7f9d10c33ae01bde5de8706d3f8edf15" Mar 13 12:22:17 crc kubenswrapper[4786]: I0313 12:22:17.759220 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vnnlz"] Mar 13 12:22:17 crc kubenswrapper[4786]: E0313 12:22:17.759860 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d540f2b8-0d3d-49a8-a69f-20f831a527de" containerName="oc" Mar 13 12:22:17 crc kubenswrapper[4786]: I0313 12:22:17.759896 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d540f2b8-0d3d-49a8-a69f-20f831a527de" containerName="oc" Mar 13 12:22:17 crc kubenswrapper[4786]: I0313 12:22:17.760080 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d540f2b8-0d3d-49a8-a69f-20f831a527de" containerName="oc" Mar 13 12:22:17 crc kubenswrapper[4786]: I0313 12:22:17.761373 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:17 crc kubenswrapper[4786]: I0313 12:22:17.765592 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnnlz"] Mar 13 12:22:17 crc kubenswrapper[4786]: I0313 12:22:17.913334 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-catalog-content\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:17 crc kubenswrapper[4786]: I0313 12:22:17.914112 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbjp\" (UniqueName: \"kubernetes.io/projected/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-kube-api-access-kfbjp\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:17 crc kubenswrapper[4786]: I0313 12:22:17.914234 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-utilities\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.015102 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-utilities\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.015192 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-catalog-content\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.015240 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbjp\" (UniqueName: \"kubernetes.io/projected/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-kube-api-access-kfbjp\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.015625 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-catalog-content\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.015977 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-utilities\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.038767 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbjp\" (UniqueName: \"kubernetes.io/projected/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-kube-api-access-kfbjp\") pod \"community-operators-vnnlz\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.088644 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.585289 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnnlz"] Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.824996 4786 generic.go:334] "Generic (PLEG): container finished" podID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerID="bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e" exitCode=0 Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.825035 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnnlz" event={"ID":"a1ab65c0-e791-43f7-a634-8c3c6b2836ec","Type":"ContainerDied","Data":"bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e"} Mar 13 12:22:18 crc kubenswrapper[4786]: I0313 12:22:18.825060 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnnlz" event={"ID":"a1ab65c0-e791-43f7-a634-8c3c6b2836ec","Type":"ContainerStarted","Data":"b2a191814d3ff9e2174bcd4c652b8ddc58fe4d8ca0177386e886f526c2cf4361"} Mar 13 12:22:19 crc kubenswrapper[4786]: I0313 12:22:19.833212 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnnlz" event={"ID":"a1ab65c0-e791-43f7-a634-8c3c6b2836ec","Type":"ContainerStarted","Data":"14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341"} Mar 13 12:22:20 crc kubenswrapper[4786]: I0313 12:22:20.847006 4786 generic.go:334] "Generic (PLEG): container finished" podID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerID="14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341" exitCode=0 Mar 13 12:22:20 crc kubenswrapper[4786]: I0313 12:22:20.847131 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnnlz" 
event={"ID":"a1ab65c0-e791-43f7-a634-8c3c6b2836ec","Type":"ContainerDied","Data":"14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341"} Mar 13 12:22:22 crc kubenswrapper[4786]: I0313 12:22:22.862074 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnnlz" event={"ID":"a1ab65c0-e791-43f7-a634-8c3c6b2836ec","Type":"ContainerStarted","Data":"efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b"} Mar 13 12:22:22 crc kubenswrapper[4786]: I0313 12:22:22.883848 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vnnlz" podStartSLOduration=2.935791706 podStartE2EDuration="5.883832461s" podCreationTimestamp="2026-03-13 12:22:17 +0000 UTC" firstStartedPulling="2026-03-13 12:22:18.826334107 +0000 UTC m=+2126.105987554" lastFinishedPulling="2026-03-13 12:22:21.774374852 +0000 UTC m=+2129.054028309" observedRunningTime="2026-03-13 12:22:22.879039301 +0000 UTC m=+2130.158692768" watchObservedRunningTime="2026-03-13 12:22:22.883832461 +0000 UTC m=+2130.163485908" Mar 13 12:22:28 crc kubenswrapper[4786]: I0313 12:22:28.089571 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:28 crc kubenswrapper[4786]: I0313 12:22:28.089921 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:28 crc kubenswrapper[4786]: I0313 12:22:28.141364 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:28 crc kubenswrapper[4786]: I0313 12:22:28.969743 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:29 crc kubenswrapper[4786]: I0313 12:22:29.039345 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-vnnlz"] Mar 13 12:22:30 crc kubenswrapper[4786]: I0313 12:22:30.927945 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vnnlz" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerName="registry-server" containerID="cri-o://efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b" gracePeriod=2 Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.346135 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.436591 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-catalog-content\") pod \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.436653 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-utilities\") pod \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.436742 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfbjp\" (UniqueName: \"kubernetes.io/projected/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-kube-api-access-kfbjp\") pod \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\" (UID: \"a1ab65c0-e791-43f7-a634-8c3c6b2836ec\") " Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.437719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-utilities" (OuterVolumeSpecName: "utilities") pod "a1ab65c0-e791-43f7-a634-8c3c6b2836ec" (UID: 
"a1ab65c0-e791-43f7-a634-8c3c6b2836ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.443130 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-kube-api-access-kfbjp" (OuterVolumeSpecName: "kube-api-access-kfbjp") pod "a1ab65c0-e791-43f7-a634-8c3c6b2836ec" (UID: "a1ab65c0-e791-43f7-a634-8c3c6b2836ec"). InnerVolumeSpecName "kube-api-access-kfbjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.498333 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1ab65c0-e791-43f7-a634-8c3c6b2836ec" (UID: "a1ab65c0-e791-43f7-a634-8c3c6b2836ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.538679 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.538732 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.538753 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfbjp\" (UniqueName: \"kubernetes.io/projected/a1ab65c0-e791-43f7-a634-8c3c6b2836ec-kube-api-access-kfbjp\") on node \"crc\" DevicePath \"\"" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.937189 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerID="efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b" exitCode=0 Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.937236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnnlz" event={"ID":"a1ab65c0-e791-43f7-a634-8c3c6b2836ec","Type":"ContainerDied","Data":"efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b"} Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.937263 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnnlz" event={"ID":"a1ab65c0-e791-43f7-a634-8c3c6b2836ec","Type":"ContainerDied","Data":"b2a191814d3ff9e2174bcd4c652b8ddc58fe4d8ca0177386e886f526c2cf4361"} Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.937280 4786 scope.go:117] "RemoveContainer" containerID="efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.937289 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnnlz" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.965394 4786 scope.go:117] "RemoveContainer" containerID="14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341" Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.970997 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnnlz"] Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.976406 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vnnlz"] Mar 13 12:22:31 crc kubenswrapper[4786]: I0313 12:22:31.996195 4786 scope.go:117] "RemoveContainer" containerID="bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e" Mar 13 12:22:32 crc kubenswrapper[4786]: I0313 12:22:32.012865 4786 scope.go:117] "RemoveContainer" containerID="efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b" Mar 13 12:22:32 crc kubenswrapper[4786]: E0313 12:22:32.013392 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b\": container with ID starting with efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b not found: ID does not exist" containerID="efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b" Mar 13 12:22:32 crc kubenswrapper[4786]: I0313 12:22:32.013440 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b"} err="failed to get container status \"efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b\": rpc error: code = NotFound desc = could not find container \"efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b\": container with ID starting with efcff09d67f567a237f0c0ae7939c70c2ce917b9aa7a618ac3dabd59afb2de0b not 
found: ID does not exist" Mar 13 12:22:32 crc kubenswrapper[4786]: I0313 12:22:32.013469 4786 scope.go:117] "RemoveContainer" containerID="14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341" Mar 13 12:22:32 crc kubenswrapper[4786]: E0313 12:22:32.013842 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341\": container with ID starting with 14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341 not found: ID does not exist" containerID="14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341" Mar 13 12:22:32 crc kubenswrapper[4786]: I0313 12:22:32.013867 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341"} err="failed to get container status \"14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341\": rpc error: code = NotFound desc = could not find container \"14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341\": container with ID starting with 14243e0101563a27201bbdfa7aee7aba2c1160d895b08bed3719a7a0c6b72341 not found: ID does not exist" Mar 13 12:22:32 crc kubenswrapper[4786]: I0313 12:22:32.013939 4786 scope.go:117] "RemoveContainer" containerID="bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e" Mar 13 12:22:32 crc kubenswrapper[4786]: E0313 12:22:32.014247 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e\": container with ID starting with bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e not found: ID does not exist" containerID="bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e" Mar 13 12:22:32 crc kubenswrapper[4786]: I0313 12:22:32.014306 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e"} err="failed to get container status \"bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e\": rpc error: code = NotFound desc = could not find container \"bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e\": container with ID starting with bfe5c98c7aa0a9b2d9b8bb82d0f89da8bbfee9dc46cb92bf03eaa6e7ee6d602e not found: ID does not exist" Mar 13 12:22:33 crc kubenswrapper[4786]: I0313 12:22:33.447801 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" path="/var/lib/kubelet/pods/a1ab65c0-e791-43f7-a634-8c3c6b2836ec/volumes" Mar 13 12:22:38 crc kubenswrapper[4786]: I0313 12:22:38.170019 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:22:38 crc kubenswrapper[4786]: I0313 12:22:38.170428 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:23:08 crc kubenswrapper[4786]: I0313 12:23:08.169279 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:23:08 crc kubenswrapper[4786]: I0313 12:23:08.169813 4786 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:23:38 crc kubenswrapper[4786]: I0313 12:23:38.169516 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:23:38 crc kubenswrapper[4786]: I0313 12:23:38.170785 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:23:38 crc kubenswrapper[4786]: I0313 12:23:38.170962 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 12:23:38 crc kubenswrapper[4786]: I0313 12:23:38.171597 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"107ba765aad702f8559e4685fe2234f75237757d40ac2a4a7a7cceb570b17bf1"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:23:38 crc kubenswrapper[4786]: I0313 12:23:38.171787 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" 
containerID="cri-o://107ba765aad702f8559e4685fe2234f75237757d40ac2a4a7a7cceb570b17bf1" gracePeriod=600 Mar 13 12:23:38 crc kubenswrapper[4786]: I0313 12:23:38.447066 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="107ba765aad702f8559e4685fe2234f75237757d40ac2a4a7a7cceb570b17bf1" exitCode=0 Mar 13 12:23:38 crc kubenswrapper[4786]: I0313 12:23:38.447100 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"107ba765aad702f8559e4685fe2234f75237757d40ac2a4a7a7cceb570b17bf1"} Mar 13 12:23:38 crc kubenswrapper[4786]: I0313 12:23:38.447129 4786 scope.go:117] "RemoveContainer" containerID="2dca62fc2e34c0c93923ea030f2c40f5b75e9df9147c5189e95297a72ac8e754" Mar 13 12:23:40 crc kubenswrapper[4786]: I0313 12:23:40.465545 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"} Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.151362 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556744-c6l45"] Mar 13 12:24:00 crc kubenswrapper[4786]: E0313 12:24:00.152364 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerName="extract-content" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.152382 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerName="extract-content" Mar 13 12:24:00 crc kubenswrapper[4786]: E0313 12:24:00.152403 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerName="extract-utilities" Mar 13 
12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.152411 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerName="extract-utilities" Mar 13 12:24:00 crc kubenswrapper[4786]: E0313 12:24:00.152434 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerName="registry-server" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.152442 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerName="registry-server" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.152634 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ab65c0-e791-43f7-a634-8c3c6b2836ec" containerName="registry-server" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.153232 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-c6l45" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.157676 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.158078 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.158344 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.171860 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-c6l45"] Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.217403 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gzn\" (UniqueName: \"kubernetes.io/projected/4d6c7377-4631-4d28-9983-7a43fbee8c3e-kube-api-access-28gzn\") pod 
\"auto-csr-approver-29556744-c6l45\" (UID: \"4d6c7377-4631-4d28-9983-7a43fbee8c3e\") " pod="openshift-infra/auto-csr-approver-29556744-c6l45" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.319001 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gzn\" (UniqueName: \"kubernetes.io/projected/4d6c7377-4631-4d28-9983-7a43fbee8c3e-kube-api-access-28gzn\") pod \"auto-csr-approver-29556744-c6l45\" (UID: \"4d6c7377-4631-4d28-9983-7a43fbee8c3e\") " pod="openshift-infra/auto-csr-approver-29556744-c6l45" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.339416 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gzn\" (UniqueName: \"kubernetes.io/projected/4d6c7377-4631-4d28-9983-7a43fbee8c3e-kube-api-access-28gzn\") pod \"auto-csr-approver-29556744-c6l45\" (UID: \"4d6c7377-4631-4d28-9983-7a43fbee8c3e\") " pod="openshift-infra/auto-csr-approver-29556744-c6l45" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.486108 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-c6l45" Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.945149 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-c6l45"] Mar 13 12:24:00 crc kubenswrapper[4786]: I0313 12:24:00.952407 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:24:01 crc kubenswrapper[4786]: I0313 12:24:01.604536 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-c6l45" event={"ID":"4d6c7377-4631-4d28-9983-7a43fbee8c3e","Type":"ContainerStarted","Data":"07ecc72e4e19ff0aac03d4397e9b4156e57048aa2887fe9984819ed2d494ed24"} Mar 13 12:24:03 crc kubenswrapper[4786]: I0313 12:24:03.620638 4786 generic.go:334] "Generic (PLEG): container finished" podID="4d6c7377-4631-4d28-9983-7a43fbee8c3e" containerID="1516b8433a9509bf7e7a6a00f912f6911ce3c22a6cf5cb2bec40001954dc9889" exitCode=0 Mar 13 12:24:03 crc kubenswrapper[4786]: I0313 12:24:03.620791 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-c6l45" event={"ID":"4d6c7377-4631-4d28-9983-7a43fbee8c3e","Type":"ContainerDied","Data":"1516b8433a9509bf7e7a6a00f912f6911ce3c22a6cf5cb2bec40001954dc9889"} Mar 13 12:24:04 crc kubenswrapper[4786]: I0313 12:24:04.950070 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-c6l45" Mar 13 12:24:04 crc kubenswrapper[4786]: I0313 12:24:04.983669 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28gzn\" (UniqueName: \"kubernetes.io/projected/4d6c7377-4631-4d28-9983-7a43fbee8c3e-kube-api-access-28gzn\") pod \"4d6c7377-4631-4d28-9983-7a43fbee8c3e\" (UID: \"4d6c7377-4631-4d28-9983-7a43fbee8c3e\") " Mar 13 12:24:04 crc kubenswrapper[4786]: I0313 12:24:04.993199 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6c7377-4631-4d28-9983-7a43fbee8c3e-kube-api-access-28gzn" (OuterVolumeSpecName: "kube-api-access-28gzn") pod "4d6c7377-4631-4d28-9983-7a43fbee8c3e" (UID: "4d6c7377-4631-4d28-9983-7a43fbee8c3e"). InnerVolumeSpecName "kube-api-access-28gzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:24:05 crc kubenswrapper[4786]: I0313 12:24:05.085131 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28gzn\" (UniqueName: \"kubernetes.io/projected/4d6c7377-4631-4d28-9983-7a43fbee8c3e-kube-api-access-28gzn\") on node \"crc\" DevicePath \"\"" Mar 13 12:24:05 crc kubenswrapper[4786]: I0313 12:24:05.641248 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-c6l45" event={"ID":"4d6c7377-4631-4d28-9983-7a43fbee8c3e","Type":"ContainerDied","Data":"07ecc72e4e19ff0aac03d4397e9b4156e57048aa2887fe9984819ed2d494ed24"} Mar 13 12:24:05 crc kubenswrapper[4786]: I0313 12:24:05.641287 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07ecc72e4e19ff0aac03d4397e9b4156e57048aa2887fe9984819ed2d494ed24" Mar 13 12:24:05 crc kubenswrapper[4786]: I0313 12:24:05.641308 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-c6l45" Mar 13 12:24:06 crc kubenswrapper[4786]: I0313 12:24:06.007642 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-m5xp8"] Mar 13 12:24:06 crc kubenswrapper[4786]: I0313 12:24:06.012848 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-m5xp8"] Mar 13 12:24:07 crc kubenswrapper[4786]: I0313 12:24:07.449503 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6296ace-d6a9-45e3-81ac-20c708bc3588" path="/var/lib/kubelet/pods/f6296ace-d6a9-45e3-81ac-20c708bc3588/volumes" Mar 13 12:24:16 crc kubenswrapper[4786]: I0313 12:24:16.977596 4786 scope.go:117] "RemoveContainer" containerID="98e46c3b3f266db9d9db2ab4bf3b3572a4b24c93091763d0c4c6e10669d44227" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.139362 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556746-s7xvx"] Mar 13 12:26:00 crc kubenswrapper[4786]: E0313 12:26:00.140134 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6c7377-4631-4d28-9983-7a43fbee8c3e" containerName="oc" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.140147 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6c7377-4631-4d28-9983-7a43fbee8c3e" containerName="oc" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.140334 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6c7377-4631-4d28-9983-7a43fbee8c3e" containerName="oc" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.140747 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.142827 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.143113 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.144100 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.148246 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-s7xvx"] Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.200970 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrr48\" (UniqueName: \"kubernetes.io/projected/a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703-kube-api-access-mrr48\") pod \"auto-csr-approver-29556746-s7xvx\" (UID: \"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703\") " pod="openshift-infra/auto-csr-approver-29556746-s7xvx" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.302555 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrr48\" (UniqueName: \"kubernetes.io/projected/a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703-kube-api-access-mrr48\") pod \"auto-csr-approver-29556746-s7xvx\" (UID: \"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703\") " pod="openshift-infra/auto-csr-approver-29556746-s7xvx" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.329649 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrr48\" (UniqueName: \"kubernetes.io/projected/a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703-kube-api-access-mrr48\") pod \"auto-csr-approver-29556746-s7xvx\" (UID: \"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703\") " 
pod="openshift-infra/auto-csr-approver-29556746-s7xvx" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.459671 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.692082 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-s7xvx"] Mar 13 12:26:00 crc kubenswrapper[4786]: I0313 12:26:00.805659 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" event={"ID":"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703","Type":"ContainerStarted","Data":"a1f44a500cb6ec2796571b986b1b2b411eb775f617c8adbe6fbbe84f5b7273ad"} Mar 13 12:26:01 crc kubenswrapper[4786]: I0313 12:26:01.815560 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" event={"ID":"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703","Type":"ContainerStarted","Data":"a019deac1eb439493d89c5e62810f2215beac2152d1faa2a84ce9410562f30f5"} Mar 13 12:26:01 crc kubenswrapper[4786]: I0313 12:26:01.830599 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" podStartSLOduration=1.004054298 podStartE2EDuration="1.830572087s" podCreationTimestamp="2026-03-13 12:26:00 +0000 UTC" firstStartedPulling="2026-03-13 12:26:00.700020927 +0000 UTC m=+2347.979674374" lastFinishedPulling="2026-03-13 12:26:01.526538716 +0000 UTC m=+2348.806192163" observedRunningTime="2026-03-13 12:26:01.829253811 +0000 UTC m=+2349.108907258" watchObservedRunningTime="2026-03-13 12:26:01.830572087 +0000 UTC m=+2349.110225544" Mar 13 12:26:02 crc kubenswrapper[4786]: I0313 12:26:02.824121 4786 generic.go:334] "Generic (PLEG): container finished" podID="a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703" containerID="a019deac1eb439493d89c5e62810f2215beac2152d1faa2a84ce9410562f30f5" exitCode=0 Mar 13 12:26:02 crc 
kubenswrapper[4786]: I0313 12:26:02.824242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" event={"ID":"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703","Type":"ContainerDied","Data":"a019deac1eb439493d89c5e62810f2215beac2152d1faa2a84ce9410562f30f5"} Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.163199 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.265113 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrr48\" (UniqueName: \"kubernetes.io/projected/a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703-kube-api-access-mrr48\") pod \"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703\" (UID: \"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703\") " Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.271751 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703-kube-api-access-mrr48" (OuterVolumeSpecName: "kube-api-access-mrr48") pod "a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703" (UID: "a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703"). InnerVolumeSpecName "kube-api-access-mrr48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.366411 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrr48\" (UniqueName: \"kubernetes.io/projected/a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703-kube-api-access-mrr48\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.842733 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" event={"ID":"a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703","Type":"ContainerDied","Data":"a1f44a500cb6ec2796571b986b1b2b411eb775f617c8adbe6fbbe84f5b7273ad"} Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.842780 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1f44a500cb6ec2796571b986b1b2b411eb775f617c8adbe6fbbe84f5b7273ad" Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.842843 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-s7xvx" Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.899428 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-mwm65"] Mar 13 12:26:04 crc kubenswrapper[4786]: I0313 12:26:04.904533 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-mwm65"] Mar 13 12:26:05 crc kubenswrapper[4786]: I0313 12:26:05.449958 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1650bdb1-4fde-4823-96eb-5f2b8a273eba" path="/var/lib/kubelet/pods/1650bdb1-4fde-4823-96eb-5f2b8a273eba/volumes" Mar 13 12:26:08 crc kubenswrapper[4786]: I0313 12:26:08.168871 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 12:26:08 crc kubenswrapper[4786]: I0313 12:26:08.169163 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:26:17 crc kubenswrapper[4786]: I0313 12:26:17.052623 4786 scope.go:117] "RemoveContainer" containerID="e06be280fff84559113fc0537f29bfbb88391f35f68a8540cbcb0a93adbeb97b" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.188978 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k6n84"] Mar 13 12:26:30 crc kubenswrapper[4786]: E0313 12:26:30.190099 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703" containerName="oc" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.190124 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703" containerName="oc" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.190343 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703" containerName="oc" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.191976 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.197988 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6n84"] Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.231858 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-utilities\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.231928 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzg4\" (UniqueName: \"kubernetes.io/projected/56948ae4-66f2-4db9-b547-47dc7819d2bf-kube-api-access-pjzg4\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.231988 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-catalog-content\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.332421 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-catalog-content\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.332537 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-utilities\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.332557 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzg4\" (UniqueName: \"kubernetes.io/projected/56948ae4-66f2-4db9-b547-47dc7819d2bf-kube-api-access-pjzg4\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.333044 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-catalog-content\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.333128 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-utilities\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.353160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzg4\" (UniqueName: \"kubernetes.io/projected/56948ae4-66f2-4db9-b547-47dc7819d2bf-kube-api-access-pjzg4\") pod \"redhat-marketplace-k6n84\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.508513 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:30 crc kubenswrapper[4786]: I0313 12:26:30.932413 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6n84"] Mar 13 12:26:31 crc kubenswrapper[4786]: I0313 12:26:31.038011 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6n84" event={"ID":"56948ae4-66f2-4db9-b547-47dc7819d2bf","Type":"ContainerStarted","Data":"59e580bb19a110b2a05f0fd1e7023287c7af209da27123892aeff1a99585e762"} Mar 13 12:26:32 crc kubenswrapper[4786]: I0313 12:26:32.048677 4786 generic.go:334] "Generic (PLEG): container finished" podID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerID="e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b" exitCode=0 Mar 13 12:26:32 crc kubenswrapper[4786]: I0313 12:26:32.048846 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6n84" event={"ID":"56948ae4-66f2-4db9-b547-47dc7819d2bf","Type":"ContainerDied","Data":"e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b"} Mar 13 12:26:34 crc kubenswrapper[4786]: I0313 12:26:34.073458 4786 generic.go:334] "Generic (PLEG): container finished" podID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerID="a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff" exitCode=0 Mar 13 12:26:34 crc kubenswrapper[4786]: I0313 12:26:34.073521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6n84" event={"ID":"56948ae4-66f2-4db9-b547-47dc7819d2bf","Type":"ContainerDied","Data":"a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff"} Mar 13 12:26:35 crc kubenswrapper[4786]: I0313 12:26:35.084789 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6n84" 
event={"ID":"56948ae4-66f2-4db9-b547-47dc7819d2bf","Type":"ContainerStarted","Data":"2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83"} Mar 13 12:26:35 crc kubenswrapper[4786]: I0313 12:26:35.115617 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k6n84" podStartSLOduration=2.587579622 podStartE2EDuration="5.115592955s" podCreationTimestamp="2026-03-13 12:26:30 +0000 UTC" firstStartedPulling="2026-03-13 12:26:32.050489827 +0000 UTC m=+2379.330143264" lastFinishedPulling="2026-03-13 12:26:34.57850314 +0000 UTC m=+2381.858156597" observedRunningTime="2026-03-13 12:26:35.106754735 +0000 UTC m=+2382.386408232" watchObservedRunningTime="2026-03-13 12:26:35.115592955 +0000 UTC m=+2382.395246422" Mar 13 12:26:38 crc kubenswrapper[4786]: I0313 12:26:38.168980 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:26:38 crc kubenswrapper[4786]: I0313 12:26:38.170798 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:26:40 crc kubenswrapper[4786]: I0313 12:26:40.509345 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:40 crc kubenswrapper[4786]: I0313 12:26:40.509605 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:40 crc kubenswrapper[4786]: I0313 12:26:40.559376 4786 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:41 crc kubenswrapper[4786]: I0313 12:26:41.173021 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:41 crc kubenswrapper[4786]: I0313 12:26:41.215656 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6n84"] Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.149830 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k6n84" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerName="registry-server" containerID="cri-o://2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83" gracePeriod=2 Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.561809 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.721564 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-utilities\") pod \"56948ae4-66f2-4db9-b547-47dc7819d2bf\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.721619 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-catalog-content\") pod \"56948ae4-66f2-4db9-b547-47dc7819d2bf\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.721739 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzg4\" (UniqueName: 
\"kubernetes.io/projected/56948ae4-66f2-4db9-b547-47dc7819d2bf-kube-api-access-pjzg4\") pod \"56948ae4-66f2-4db9-b547-47dc7819d2bf\" (UID: \"56948ae4-66f2-4db9-b547-47dc7819d2bf\") " Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.723168 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-utilities" (OuterVolumeSpecName: "utilities") pod "56948ae4-66f2-4db9-b547-47dc7819d2bf" (UID: "56948ae4-66f2-4db9-b547-47dc7819d2bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.729475 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56948ae4-66f2-4db9-b547-47dc7819d2bf-kube-api-access-pjzg4" (OuterVolumeSpecName: "kube-api-access-pjzg4") pod "56948ae4-66f2-4db9-b547-47dc7819d2bf" (UID: "56948ae4-66f2-4db9-b547-47dc7819d2bf"). InnerVolumeSpecName "kube-api-access-pjzg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.748236 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56948ae4-66f2-4db9-b547-47dc7819d2bf" (UID: "56948ae4-66f2-4db9-b547-47dc7819d2bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.824144 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.824180 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56948ae4-66f2-4db9-b547-47dc7819d2bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:43 crc kubenswrapper[4786]: I0313 12:26:43.824191 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzg4\" (UniqueName: \"kubernetes.io/projected/56948ae4-66f2-4db9-b547-47dc7819d2bf-kube-api-access-pjzg4\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.159665 4786 generic.go:334] "Generic (PLEG): container finished" podID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerID="2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83" exitCode=0 Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.159714 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6n84" event={"ID":"56948ae4-66f2-4db9-b547-47dc7819d2bf","Type":"ContainerDied","Data":"2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83"} Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.159786 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6n84" event={"ID":"56948ae4-66f2-4db9-b547-47dc7819d2bf","Type":"ContainerDied","Data":"59e580bb19a110b2a05f0fd1e7023287c7af209da27123892aeff1a99585e762"} Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.159755 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6n84" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.159823 4786 scope.go:117] "RemoveContainer" containerID="2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.182067 4786 scope.go:117] "RemoveContainer" containerID="a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.197860 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6n84"] Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.205301 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6n84"] Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.228381 4786 scope.go:117] "RemoveContainer" containerID="e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.243173 4786 scope.go:117] "RemoveContainer" containerID="2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83" Mar 13 12:26:44 crc kubenswrapper[4786]: E0313 12:26:44.243633 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83\": container with ID starting with 2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83 not found: ID does not exist" containerID="2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.243694 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83"} err="failed to get container status \"2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83\": rpc error: code = NotFound desc = could not find container 
\"2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83\": container with ID starting with 2e6e1e0031bf34e00195a7dc0a95b1b7fe3833713d0d37da62edea4e62badb83 not found: ID does not exist" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.243755 4786 scope.go:117] "RemoveContainer" containerID="a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff" Mar 13 12:26:44 crc kubenswrapper[4786]: E0313 12:26:44.244164 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff\": container with ID starting with a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff not found: ID does not exist" containerID="a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.244213 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff"} err="failed to get container status \"a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff\": rpc error: code = NotFound desc = could not find container \"a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff\": container with ID starting with a32e5c30552d8b44b27c005db64e2103719c6b1d8836b47335bb8b7e0404c4ff not found: ID does not exist" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.244238 4786 scope.go:117] "RemoveContainer" containerID="e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b" Mar 13 12:26:44 crc kubenswrapper[4786]: E0313 12:26:44.244537 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b\": container with ID starting with e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b not found: ID does not exist" 
containerID="e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b" Mar 13 12:26:44 crc kubenswrapper[4786]: I0313 12:26:44.244563 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b"} err="failed to get container status \"e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b\": rpc error: code = NotFound desc = could not find container \"e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b\": container with ID starting with e401b794e2e17480cb17825c8249f6ec80be3cb9c8e4881d6c15e703a876261b not found: ID does not exist" Mar 13 12:26:45 crc kubenswrapper[4786]: I0313 12:26:45.449674 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" path="/var/lib/kubelet/pods/56948ae4-66f2-4db9-b547-47dc7819d2bf/volumes" Mar 13 12:27:08 crc kubenswrapper[4786]: I0313 12:27:08.168658 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:27:08 crc kubenswrapper[4786]: I0313 12:27:08.169264 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:27:08 crc kubenswrapper[4786]: I0313 12:27:08.169307 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 12:27:08 crc kubenswrapper[4786]: I0313 12:27:08.169904 4786 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:27:08 crc kubenswrapper[4786]: I0313 12:27:08.169952 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" gracePeriod=600 Mar 13 12:27:08 crc kubenswrapper[4786]: E0313 12:27:08.294297 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:27:08 crc kubenswrapper[4786]: I0313 12:27:08.391408 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" exitCode=0 Mar 13 12:27:08 crc kubenswrapper[4786]: I0313 12:27:08.391514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"} Mar 13 12:27:08 crc kubenswrapper[4786]: I0313 12:27:08.391690 4786 scope.go:117] "RemoveContainer" containerID="107ba765aad702f8559e4685fe2234f75237757d40ac2a4a7a7cceb570b17bf1" Mar 13 12:27:08 crc 
kubenswrapper[4786]: I0313 12:27:08.392250 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:27:08 crc kubenswrapper[4786]: E0313 12:27:08.393596 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:27:22 crc kubenswrapper[4786]: I0313 12:27:22.441122 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:27:22 crc kubenswrapper[4786]: E0313 12:27:22.442045 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:27:34 crc kubenswrapper[4786]: I0313 12:27:34.441461 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:27:34 crc kubenswrapper[4786]: E0313 12:27:34.442033 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 
13 12:27:47 crc kubenswrapper[4786]: I0313 12:27:47.441246 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:27:47 crc kubenswrapper[4786]: E0313 12:27:47.441839 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.155961 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556748-4qlwm"] Mar 13 12:28:00 crc kubenswrapper[4786]: E0313 12:28:00.157391 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerName="registry-server" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.157424 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerName="registry-server" Mar 13 12:28:00 crc kubenswrapper[4786]: E0313 12:28:00.157458 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerName="extract-content" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.157475 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerName="extract-content" Mar 13 12:28:00 crc kubenswrapper[4786]: E0313 12:28:00.157524 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerName="extract-utilities" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.157544 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" 
containerName="extract-utilities" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.157874 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="56948ae4-66f2-4db9-b547-47dc7819d2bf" containerName="registry-server" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.158967 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-4qlwm" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.162228 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk9vd\" (UniqueName: \"kubernetes.io/projected/80fd49ea-8f19-4ffa-9ef5-298c34c4dea6-kube-api-access-qk9vd\") pod \"auto-csr-approver-29556748-4qlwm\" (UID: \"80fd49ea-8f19-4ffa-9ef5-298c34c4dea6\") " pod="openshift-infra/auto-csr-approver-29556748-4qlwm" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.165490 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.165745 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.165776 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.169657 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-4qlwm"] Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.263560 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk9vd\" (UniqueName: \"kubernetes.io/projected/80fd49ea-8f19-4ffa-9ef5-298c34c4dea6-kube-api-access-qk9vd\") pod \"auto-csr-approver-29556748-4qlwm\" (UID: \"80fd49ea-8f19-4ffa-9ef5-298c34c4dea6\") " pod="openshift-infra/auto-csr-approver-29556748-4qlwm" Mar 13 
12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.283275 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk9vd\" (UniqueName: \"kubernetes.io/projected/80fd49ea-8f19-4ffa-9ef5-298c34c4dea6-kube-api-access-qk9vd\") pod \"auto-csr-approver-29556748-4qlwm\" (UID: \"80fd49ea-8f19-4ffa-9ef5-298c34c4dea6\") " pod="openshift-infra/auto-csr-approver-29556748-4qlwm" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.491482 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-4qlwm" Mar 13 12:28:00 crc kubenswrapper[4786]: I0313 12:28:00.930769 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-4qlwm"] Mar 13 12:28:01 crc kubenswrapper[4786]: I0313 12:28:01.238585 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-4qlwm" event={"ID":"80fd49ea-8f19-4ffa-9ef5-298c34c4dea6","Type":"ContainerStarted","Data":"29d35eb19376b02ba1d64a3f8d0b7ef9d2cd1518f6513aa1b43203e4830c24c0"} Mar 13 12:28:02 crc kubenswrapper[4786]: I0313 12:28:02.440872 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:28:02 crc kubenswrapper[4786]: E0313 12:28:02.441529 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:28:03 crc kubenswrapper[4786]: I0313 12:28:03.251761 4786 generic.go:334] "Generic (PLEG): container finished" podID="80fd49ea-8f19-4ffa-9ef5-298c34c4dea6" 
containerID="93d1dbd4768d90ec32952728df37983b9dfa60a30a38c4ec7e7389b29913417f" exitCode=0 Mar 13 12:28:03 crc kubenswrapper[4786]: I0313 12:28:03.251852 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-4qlwm" event={"ID":"80fd49ea-8f19-4ffa-9ef5-298c34c4dea6","Type":"ContainerDied","Data":"93d1dbd4768d90ec32952728df37983b9dfa60a30a38c4ec7e7389b29913417f"} Mar 13 12:28:04 crc kubenswrapper[4786]: I0313 12:28:04.627024 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-4qlwm" Mar 13 12:28:04 crc kubenswrapper[4786]: I0313 12:28:04.728455 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk9vd\" (UniqueName: \"kubernetes.io/projected/80fd49ea-8f19-4ffa-9ef5-298c34c4dea6-kube-api-access-qk9vd\") pod \"80fd49ea-8f19-4ffa-9ef5-298c34c4dea6\" (UID: \"80fd49ea-8f19-4ffa-9ef5-298c34c4dea6\") " Mar 13 12:28:04 crc kubenswrapper[4786]: I0313 12:28:04.739095 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80fd49ea-8f19-4ffa-9ef5-298c34c4dea6-kube-api-access-qk9vd" (OuterVolumeSpecName: "kube-api-access-qk9vd") pod "80fd49ea-8f19-4ffa-9ef5-298c34c4dea6" (UID: "80fd49ea-8f19-4ffa-9ef5-298c34c4dea6"). InnerVolumeSpecName "kube-api-access-qk9vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:28:04 crc kubenswrapper[4786]: I0313 12:28:04.829902 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk9vd\" (UniqueName: \"kubernetes.io/projected/80fd49ea-8f19-4ffa-9ef5-298c34c4dea6-kube-api-access-qk9vd\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:05 crc kubenswrapper[4786]: I0313 12:28:05.267658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-4qlwm" event={"ID":"80fd49ea-8f19-4ffa-9ef5-298c34c4dea6","Type":"ContainerDied","Data":"29d35eb19376b02ba1d64a3f8d0b7ef9d2cd1518f6513aa1b43203e4830c24c0"} Mar 13 12:28:05 crc kubenswrapper[4786]: I0313 12:28:05.267702 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d35eb19376b02ba1d64a3f8d0b7ef9d2cd1518f6513aa1b43203e4830c24c0" Mar 13 12:28:05 crc kubenswrapper[4786]: I0313 12:28:05.267700 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-4qlwm" Mar 13 12:28:05 crc kubenswrapper[4786]: I0313 12:28:05.694201 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-bvpzq"] Mar 13 12:28:05 crc kubenswrapper[4786]: I0313 12:28:05.702913 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-bvpzq"] Mar 13 12:28:07 crc kubenswrapper[4786]: I0313 12:28:07.450835 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d540f2b8-0d3d-49a8-a69f-20f831a527de" path="/var/lib/kubelet/pods/d540f2b8-0d3d-49a8-a69f-20f831a527de/volumes" Mar 13 12:28:13 crc kubenswrapper[4786]: I0313 12:28:13.443845 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:28:13 crc kubenswrapper[4786]: E0313 12:28:13.444546 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.060832 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kbg65"] Mar 13 12:28:14 crc kubenswrapper[4786]: E0313 12:28:14.061675 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fd49ea-8f19-4ffa-9ef5-298c34c4dea6" containerName="oc" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.061693 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fd49ea-8f19-4ffa-9ef5-298c34c4dea6" containerName="oc" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.061874 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fd49ea-8f19-4ffa-9ef5-298c34c4dea6" containerName="oc" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.066231 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.073321 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbg65"] Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.146643 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-utilities\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.146767 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-catalog-content\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.146963 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7lh\" (UniqueName: \"kubernetes.io/projected/d871e656-70a1-40d8-b203-e2344a6b61db-kube-api-access-gj7lh\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.248013 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-catalog-content\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.248105 4786 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-gj7lh\" (UniqueName: \"kubernetes.io/projected/d871e656-70a1-40d8-b203-e2344a6b61db-kube-api-access-gj7lh\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.248139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-utilities\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.248601 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-catalog-content\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.248651 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-utilities\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.272880 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7lh\" (UniqueName: \"kubernetes.io/projected/d871e656-70a1-40d8-b203-e2344a6b61db-kube-api-access-gj7lh\") pod \"redhat-operators-kbg65\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.395301 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:14 crc kubenswrapper[4786]: I0313 12:28:14.837017 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbg65"] Mar 13 12:28:15 crc kubenswrapper[4786]: I0313 12:28:15.341952 4786 generic.go:334] "Generic (PLEG): container finished" podID="d871e656-70a1-40d8-b203-e2344a6b61db" containerID="c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a" exitCode=0 Mar 13 12:28:15 crc kubenswrapper[4786]: I0313 12:28:15.342030 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbg65" event={"ID":"d871e656-70a1-40d8-b203-e2344a6b61db","Type":"ContainerDied","Data":"c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a"} Mar 13 12:28:15 crc kubenswrapper[4786]: I0313 12:28:15.342088 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbg65" event={"ID":"d871e656-70a1-40d8-b203-e2344a6b61db","Type":"ContainerStarted","Data":"9752912d40d2927d7fd86e888b9ee9800fcec7dc06c9aaed98c291c60c0050b9"} Mar 13 12:28:16 crc kubenswrapper[4786]: I0313 12:28:16.351961 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbg65" event={"ID":"d871e656-70a1-40d8-b203-e2344a6b61db","Type":"ContainerStarted","Data":"4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa"} Mar 13 12:28:17 crc kubenswrapper[4786]: I0313 12:28:17.164413 4786 scope.go:117] "RemoveContainer" containerID="a4ff15a12184a54374f01e28398c0774a3ee809e272517aaa658f6c7de97382e" Mar 13 12:28:17 crc kubenswrapper[4786]: I0313 12:28:17.360800 4786 generic.go:334] "Generic (PLEG): container finished" podID="d871e656-70a1-40d8-b203-e2344a6b61db" containerID="4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa" exitCode=0 Mar 13 12:28:17 crc kubenswrapper[4786]: I0313 12:28:17.360854 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-kbg65" event={"ID":"d871e656-70a1-40d8-b203-e2344a6b61db","Type":"ContainerDied","Data":"4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa"} Mar 13 12:28:18 crc kubenswrapper[4786]: I0313 12:28:18.375708 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbg65" event={"ID":"d871e656-70a1-40d8-b203-e2344a6b61db","Type":"ContainerStarted","Data":"d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5"} Mar 13 12:28:18 crc kubenswrapper[4786]: I0313 12:28:18.401823 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kbg65" podStartSLOduration=2.01309873 podStartE2EDuration="4.401809463s" podCreationTimestamp="2026-03-13 12:28:14 +0000 UTC" firstStartedPulling="2026-03-13 12:28:15.343782646 +0000 UTC m=+2482.623436093" lastFinishedPulling="2026-03-13 12:28:17.732493379 +0000 UTC m=+2485.012146826" observedRunningTime="2026-03-13 12:28:18.399547102 +0000 UTC m=+2485.679200559" watchObservedRunningTime="2026-03-13 12:28:18.401809463 +0000 UTC m=+2485.681462910" Mar 13 12:28:18 crc kubenswrapper[4786]: I0313 12:28:18.843944 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6kxn7"] Mar 13 12:28:18 crc kubenswrapper[4786]: I0313 12:28:18.845553 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:18 crc kubenswrapper[4786]: I0313 12:28:18.865352 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kxn7"] Mar 13 12:28:18 crc kubenswrapper[4786]: I0313 12:28:18.917540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gkj7\" (UniqueName: \"kubernetes.io/projected/bd71c645-05aa-4637-a976-0575dca0b8b6-kube-api-access-5gkj7\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:18 crc kubenswrapper[4786]: I0313 12:28:18.917613 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-utilities\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:18 crc kubenswrapper[4786]: I0313 12:28:18.917697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-catalog-content\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:19 crc kubenswrapper[4786]: I0313 12:28:19.019285 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-catalog-content\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:19 crc kubenswrapper[4786]: I0313 12:28:19.019364 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5gkj7\" (UniqueName: \"kubernetes.io/projected/bd71c645-05aa-4637-a976-0575dca0b8b6-kube-api-access-5gkj7\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:19 crc kubenswrapper[4786]: I0313 12:28:19.019395 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-utilities\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:19 crc kubenswrapper[4786]: I0313 12:28:19.020017 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-catalog-content\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:19 crc kubenswrapper[4786]: I0313 12:28:19.020025 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-utilities\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:19 crc kubenswrapper[4786]: I0313 12:28:19.056006 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gkj7\" (UniqueName: \"kubernetes.io/projected/bd71c645-05aa-4637-a976-0575dca0b8b6-kube-api-access-5gkj7\") pod \"certified-operators-6kxn7\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:19 crc kubenswrapper[4786]: I0313 12:28:19.163454 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:19 crc kubenswrapper[4786]: I0313 12:28:19.612148 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kxn7"] Mar 13 12:28:19 crc kubenswrapper[4786]: W0313 12:28:19.625172 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd71c645_05aa_4637_a976_0575dca0b8b6.slice/crio-018451f1d732ac85f606cb1ec4063c80bd4ed432df1cc3fa96988edb09d2f028 WatchSource:0}: Error finding container 018451f1d732ac85f606cb1ec4063c80bd4ed432df1cc3fa96988edb09d2f028: Status 404 returned error can't find the container with id 018451f1d732ac85f606cb1ec4063c80bd4ed432df1cc3fa96988edb09d2f028 Mar 13 12:28:20 crc kubenswrapper[4786]: I0313 12:28:20.391925 4786 generic.go:334] "Generic (PLEG): container finished" podID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerID="6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043" exitCode=0 Mar 13 12:28:20 crc kubenswrapper[4786]: I0313 12:28:20.392018 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kxn7" event={"ID":"bd71c645-05aa-4637-a976-0575dca0b8b6","Type":"ContainerDied","Data":"6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043"} Mar 13 12:28:20 crc kubenswrapper[4786]: I0313 12:28:20.392292 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kxn7" event={"ID":"bd71c645-05aa-4637-a976-0575dca0b8b6","Type":"ContainerStarted","Data":"018451f1d732ac85f606cb1ec4063c80bd4ed432df1cc3fa96988edb09d2f028"} Mar 13 12:28:21 crc kubenswrapper[4786]: I0313 12:28:21.401936 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kxn7" 
event={"ID":"bd71c645-05aa-4637-a976-0575dca0b8b6","Type":"ContainerStarted","Data":"7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc"} Mar 13 12:28:22 crc kubenswrapper[4786]: I0313 12:28:22.410476 4786 generic.go:334] "Generic (PLEG): container finished" podID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerID="7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc" exitCode=0 Mar 13 12:28:22 crc kubenswrapper[4786]: I0313 12:28:22.410533 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kxn7" event={"ID":"bd71c645-05aa-4637-a976-0575dca0b8b6","Type":"ContainerDied","Data":"7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc"} Mar 13 12:28:23 crc kubenswrapper[4786]: I0313 12:28:23.419654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kxn7" event={"ID":"bd71c645-05aa-4637-a976-0575dca0b8b6","Type":"ContainerStarted","Data":"d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b"} Mar 13 12:28:23 crc kubenswrapper[4786]: I0313 12:28:23.439814 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6kxn7" podStartSLOduration=3.00502603 podStartE2EDuration="5.439795413s" podCreationTimestamp="2026-03-13 12:28:18 +0000 UTC" firstStartedPulling="2026-03-13 12:28:20.393489749 +0000 UTC m=+2487.673143196" lastFinishedPulling="2026-03-13 12:28:22.828259132 +0000 UTC m=+2490.107912579" observedRunningTime="2026-03-13 12:28:23.43496756 +0000 UTC m=+2490.714621027" watchObservedRunningTime="2026-03-13 12:28:23.439795413 +0000 UTC m=+2490.719448860" Mar 13 12:28:24 crc kubenswrapper[4786]: I0313 12:28:24.395725 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:24 crc kubenswrapper[4786]: I0313 12:28:24.395811 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:24 crc kubenswrapper[4786]: I0313 12:28:24.440471 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:28:24 crc kubenswrapper[4786]: E0313 12:28:24.440755 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:28:25 crc kubenswrapper[4786]: I0313 12:28:25.446415 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kbg65" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="registry-server" probeResult="failure" output=< Mar 13 12:28:25 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 12:28:25 crc kubenswrapper[4786]: > Mar 13 12:28:29 crc kubenswrapper[4786]: I0313 12:28:29.163597 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:29 crc kubenswrapper[4786]: I0313 12:28:29.163982 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:29 crc kubenswrapper[4786]: I0313 12:28:29.209371 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:29 crc kubenswrapper[4786]: I0313 12:28:29.495994 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:31 crc kubenswrapper[4786]: I0313 12:28:31.387501 4786 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6kxn7"] Mar 13 12:28:31 crc kubenswrapper[4786]: I0313 12:28:31.465534 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6kxn7" podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerName="registry-server" containerID="cri-o://d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b" gracePeriod=2 Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.139148 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.205460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-catalog-content\") pod \"bd71c645-05aa-4637-a976-0575dca0b8b6\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.205539 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-utilities\") pod \"bd71c645-05aa-4637-a976-0575dca0b8b6\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.205667 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gkj7\" (UniqueName: \"kubernetes.io/projected/bd71c645-05aa-4637-a976-0575dca0b8b6-kube-api-access-5gkj7\") pod \"bd71c645-05aa-4637-a976-0575dca0b8b6\" (UID: \"bd71c645-05aa-4637-a976-0575dca0b8b6\") " Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.207491 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-utilities" (OuterVolumeSpecName: "utilities") pod 
"bd71c645-05aa-4637-a976-0575dca0b8b6" (UID: "bd71c645-05aa-4637-a976-0575dca0b8b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.212719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd71c645-05aa-4637-a976-0575dca0b8b6-kube-api-access-5gkj7" (OuterVolumeSpecName: "kube-api-access-5gkj7") pod "bd71c645-05aa-4637-a976-0575dca0b8b6" (UID: "bd71c645-05aa-4637-a976-0575dca0b8b6"). InnerVolumeSpecName "kube-api-access-5gkj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.307799 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.307837 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gkj7\" (UniqueName: \"kubernetes.io/projected/bd71c645-05aa-4637-a976-0575dca0b8b6-kube-api-access-5gkj7\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.437344 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd71c645-05aa-4637-a976-0575dca0b8b6" (UID: "bd71c645-05aa-4637-a976-0575dca0b8b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.477935 4786 generic.go:334] "Generic (PLEG): container finished" podID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerID="d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b" exitCode=0 Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.478025 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kxn7" event={"ID":"bd71c645-05aa-4637-a976-0575dca0b8b6","Type":"ContainerDied","Data":"d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b"} Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.478045 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kxn7" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.478088 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kxn7" event={"ID":"bd71c645-05aa-4637-a976-0575dca0b8b6","Type":"ContainerDied","Data":"018451f1d732ac85f606cb1ec4063c80bd4ed432df1cc3fa96988edb09d2f028"} Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.478114 4786 scope.go:117] "RemoveContainer" containerID="d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.496621 4786 scope.go:117] "RemoveContainer" containerID="7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.513296 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd71c645-05aa-4637-a976-0575dca0b8b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.516526 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6kxn7"] Mar 13 12:28:32 crc kubenswrapper[4786]: 
I0313 12:28:32.521255 4786 scope.go:117] "RemoveContainer" containerID="6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.521732 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6kxn7"] Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.550200 4786 scope.go:117] "RemoveContainer" containerID="d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b" Mar 13 12:28:32 crc kubenswrapper[4786]: E0313 12:28:32.550661 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b\": container with ID starting with d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b not found: ID does not exist" containerID="d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.550709 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b"} err="failed to get container status \"d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b\": rpc error: code = NotFound desc = could not find container \"d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b\": container with ID starting with d8bf3afa36c1920f4390c9d1446a60f3ade1394796edfdbffe80a026049fee7b not found: ID does not exist" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.550740 4786 scope.go:117] "RemoveContainer" containerID="7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc" Mar 13 12:28:32 crc kubenswrapper[4786]: E0313 12:28:32.551172 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc\": container 
with ID starting with 7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc not found: ID does not exist" containerID="7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.551235 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc"} err="failed to get container status \"7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc\": rpc error: code = NotFound desc = could not find container \"7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc\": container with ID starting with 7ced3d9b43ab84e45de89d5b010f4e6d5cfdda7149a3b7bb321f0e16874ed5cc not found: ID does not exist" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.551263 4786 scope.go:117] "RemoveContainer" containerID="6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043" Mar 13 12:28:32 crc kubenswrapper[4786]: E0313 12:28:32.551721 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043\": container with ID starting with 6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043 not found: ID does not exist" containerID="6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043" Mar 13 12:28:32 crc kubenswrapper[4786]: I0313 12:28:32.551753 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043"} err="failed to get container status \"6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043\": rpc error: code = NotFound desc = could not find container \"6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043\": container with ID starting with 6f9a273108efd23a634257f9d2f4a10ac3391f09808584afdc7ea600e88aa043 not 
found: ID does not exist" Mar 13 12:28:33 crc kubenswrapper[4786]: I0313 12:28:33.449333 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" path="/var/lib/kubelet/pods/bd71c645-05aa-4637-a976-0575dca0b8b6/volumes" Mar 13 12:28:34 crc kubenswrapper[4786]: I0313 12:28:34.452298 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:34 crc kubenswrapper[4786]: I0313 12:28:34.506181 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:35 crc kubenswrapper[4786]: I0313 12:28:35.440129 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:28:35 crc kubenswrapper[4786]: E0313 12:28:35.440475 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:28:35 crc kubenswrapper[4786]: I0313 12:28:35.586768 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbg65"] Mar 13 12:28:35 crc kubenswrapper[4786]: I0313 12:28:35.587013 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kbg65" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="registry-server" containerID="cri-o://d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5" gracePeriod=2 Mar 13 12:28:35 crc kubenswrapper[4786]: I0313 12:28:35.958607 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.064478 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-utilities\") pod \"d871e656-70a1-40d8-b203-e2344a6b61db\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.064615 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-catalog-content\") pod \"d871e656-70a1-40d8-b203-e2344a6b61db\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.064654 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj7lh\" (UniqueName: \"kubernetes.io/projected/d871e656-70a1-40d8-b203-e2344a6b61db-kube-api-access-gj7lh\") pod \"d871e656-70a1-40d8-b203-e2344a6b61db\" (UID: \"d871e656-70a1-40d8-b203-e2344a6b61db\") " Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.065579 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-utilities" (OuterVolumeSpecName: "utilities") pod "d871e656-70a1-40d8-b203-e2344a6b61db" (UID: "d871e656-70a1-40d8-b203-e2344a6b61db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.070118 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d871e656-70a1-40d8-b203-e2344a6b61db-kube-api-access-gj7lh" (OuterVolumeSpecName: "kube-api-access-gj7lh") pod "d871e656-70a1-40d8-b203-e2344a6b61db" (UID: "d871e656-70a1-40d8-b203-e2344a6b61db"). InnerVolumeSpecName "kube-api-access-gj7lh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.166395 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.166436 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj7lh\" (UniqueName: \"kubernetes.io/projected/d871e656-70a1-40d8-b203-e2344a6b61db-kube-api-access-gj7lh\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.216639 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d871e656-70a1-40d8-b203-e2344a6b61db" (UID: "d871e656-70a1-40d8-b203-e2344a6b61db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.267483 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d871e656-70a1-40d8-b203-e2344a6b61db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.510479 4786 generic.go:334] "Generic (PLEG): container finished" podID="d871e656-70a1-40d8-b203-e2344a6b61db" containerID="d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5" exitCode=0 Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.510521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbg65" event={"ID":"d871e656-70a1-40d8-b203-e2344a6b61db","Type":"ContainerDied","Data":"d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5"} Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.510549 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kbg65" event={"ID":"d871e656-70a1-40d8-b203-e2344a6b61db","Type":"ContainerDied","Data":"9752912d40d2927d7fd86e888b9ee9800fcec7dc06c9aaed98c291c60c0050b9"} Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.510565 4786 scope.go:117] "RemoveContainer" containerID="d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.510570 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbg65" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.525237 4786 scope.go:117] "RemoveContainer" containerID="4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.545624 4786 scope.go:117] "RemoveContainer" containerID="c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.566965 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbg65"] Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.573372 4786 scope.go:117] "RemoveContainer" containerID="d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5" Mar 13 12:28:36 crc kubenswrapper[4786]: E0313 12:28:36.573742 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5\": container with ID starting with d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5 not found: ID does not exist" containerID="d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.573775 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5"} err="failed to 
get container status \"d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5\": rpc error: code = NotFound desc = could not find container \"d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5\": container with ID starting with d9a73801532466fed53199a6a3683201f170e383ab9d1e8e4fcbb0e9f629fbc5 not found: ID does not exist" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.573797 4786 scope.go:117] "RemoveContainer" containerID="4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa" Mar 13 12:28:36 crc kubenswrapper[4786]: E0313 12:28:36.574022 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa\": container with ID starting with 4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa not found: ID does not exist" containerID="4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.574046 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa"} err="failed to get container status \"4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa\": rpc error: code = NotFound desc = could not find container \"4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa\": container with ID starting with 4fcc8b68b4be0a73cbf6db404da4c5fe7d88794a1240d7794ce36d2e439bd0fa not found: ID does not exist" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.574059 4786 scope.go:117] "RemoveContainer" containerID="c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.574162 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kbg65"] Mar 13 12:28:36 crc kubenswrapper[4786]: E0313 12:28:36.574508 
4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a\": container with ID starting with c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a not found: ID does not exist" containerID="c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a" Mar 13 12:28:36 crc kubenswrapper[4786]: I0313 12:28:36.574643 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a"} err="failed to get container status \"c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a\": rpc error: code = NotFound desc = could not find container \"c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a\": container with ID starting with c27e80b8d114d4b542dbf3c5554b20457caa1ed4f3f4e336f3be940a0981cf5a not found: ID does not exist" Mar 13 12:28:37 crc kubenswrapper[4786]: I0313 12:28:37.450787 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" path="/var/lib/kubelet/pods/d871e656-70a1-40d8-b203-e2344a6b61db/volumes" Mar 13 12:28:49 crc kubenswrapper[4786]: I0313 12:28:49.440746 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:28:49 crc kubenswrapper[4786]: E0313 12:28:49.441587 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:29:03 crc kubenswrapper[4786]: I0313 12:29:03.445343 4786 scope.go:117] 
"RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:29:03 crc kubenswrapper[4786]: E0313 12:29:03.446829 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:29:17 crc kubenswrapper[4786]: I0313 12:29:17.441542 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:29:17 crc kubenswrapper[4786]: E0313 12:29:17.442769 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:29:32 crc kubenswrapper[4786]: I0313 12:29:32.440672 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:29:32 crc kubenswrapper[4786]: E0313 12:29:32.441564 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:29:44 crc kubenswrapper[4786]: I0313 12:29:44.440322 
4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:29:44 crc kubenswrapper[4786]: E0313 12:29:44.441030 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:29:59 crc kubenswrapper[4786]: I0313 12:29:59.441111 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f" Mar 13 12:29:59 crc kubenswrapper[4786]: E0313 12:29:59.442175 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.156270 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556750-p4nw4"] Mar 13 12:30:00 crc kubenswrapper[4786]: E0313 12:30:00.156763 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="registry-server" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.156775 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="registry-server" Mar 13 12:30:00 crc kubenswrapper[4786]: E0313 12:30:00.156785 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerName="extract-content" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.156793 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerName="extract-content" Mar 13 12:30:00 crc kubenswrapper[4786]: E0313 12:30:00.156805 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="extract-content" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.156810 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="extract-content" Mar 13 12:30:00 crc kubenswrapper[4786]: E0313 12:30:00.156822 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerName="extract-utilities" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.156829 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerName="extract-utilities" Mar 13 12:30:00 crc kubenswrapper[4786]: E0313 12:30:00.156841 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="extract-utilities" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.156848 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="extract-utilities" Mar 13 12:30:00 crc kubenswrapper[4786]: E0313 12:30:00.156856 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerName="registry-server" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.156862 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerName="registry-server" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.157002 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bd71c645-05aa-4637-a976-0575dca0b8b6" containerName="registry-server" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.157018 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d871e656-70a1-40d8-b203-e2344a6b61db" containerName="registry-server" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.157456 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-p4nw4" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.163425 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.163456 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.175286 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.180259 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w"] Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.181569 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.182924 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.182923 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.187635 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-p4nw4"] Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.193994 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w"] Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.301457 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvvpn\" (UniqueName: \"kubernetes.io/projected/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-kube-api-access-xvvpn\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.301517 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9mbf\" (UniqueName: \"kubernetes.io/projected/6ef9a654-762c-4834-a1d9-3eeca80a0561-kube-api-access-m9mbf\") pod \"auto-csr-approver-29556750-p4nw4\" (UID: \"6ef9a654-762c-4834-a1d9-3eeca80a0561\") " pod="openshift-infra/auto-csr-approver-29556750-p4nw4" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.301605 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-secret-volume\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.301667 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-config-volume\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.402774 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9mbf\" (UniqueName: \"kubernetes.io/projected/6ef9a654-762c-4834-a1d9-3eeca80a0561-kube-api-access-m9mbf\") pod \"auto-csr-approver-29556750-p4nw4\" (UID: \"6ef9a654-762c-4834-a1d9-3eeca80a0561\") " pod="openshift-infra/auto-csr-approver-29556750-p4nw4" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.402824 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-secret-volume\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.402852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-config-volume\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.402954 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvvpn\" (UniqueName: \"kubernetes.io/projected/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-kube-api-access-xvvpn\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.404095 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-config-volume\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.409993 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-secret-volume\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.419028 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9mbf\" (UniqueName: \"kubernetes.io/projected/6ef9a654-762c-4834-a1d9-3eeca80a0561-kube-api-access-m9mbf\") pod \"auto-csr-approver-29556750-p4nw4\" (UID: \"6ef9a654-762c-4834-a1d9-3eeca80a0561\") " pod="openshift-infra/auto-csr-approver-29556750-p4nw4" Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.421845 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvvpn\" (UniqueName: \"kubernetes.io/projected/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-kube-api-access-xvvpn\") pod \"collect-profiles-29556750-p949w\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w"
Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.522155 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-p4nw4"
Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.542345 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w"
Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.823281 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w"]
Mar 13 12:30:00 crc kubenswrapper[4786]: W0313 12:30:00.828247 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fcdaec4_eb84_4831_936e_56b3fdb0ef96.slice/crio-99c12229eaf383b58c08fc62d47b9532c9aa1d084ba99e4eb25f7e34c7526a21 WatchSource:0}: Error finding container 99c12229eaf383b58c08fc62d47b9532c9aa1d084ba99e4eb25f7e34c7526a21: Status 404 returned error can't find the container with id 99c12229eaf383b58c08fc62d47b9532c9aa1d084ba99e4eb25f7e34c7526a21
Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.954710 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-p4nw4"]
Mar 13 12:30:00 crc kubenswrapper[4786]: W0313 12:30:00.958745 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef9a654_762c_4834_a1d9_3eeca80a0561.slice/crio-a77c933fcc29a7de4009a5d540d37e2f2a34f97021aa9d0ca4bafd3a29856486 WatchSource:0}: Error finding container a77c933fcc29a7de4009a5d540d37e2f2a34f97021aa9d0ca4bafd3a29856486: Status 404 returned error can't find the container with id a77c933fcc29a7de4009a5d540d37e2f2a34f97021aa9d0ca4bafd3a29856486
Mar 13 12:30:00 crc kubenswrapper[4786]: I0313 12:30:00.961481 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 12:30:01 crc kubenswrapper[4786]: I0313 12:30:01.145575 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-p4nw4" event={"ID":"6ef9a654-762c-4834-a1d9-3eeca80a0561","Type":"ContainerStarted","Data":"a77c933fcc29a7de4009a5d540d37e2f2a34f97021aa9d0ca4bafd3a29856486"}
Mar 13 12:30:01 crc kubenswrapper[4786]: I0313 12:30:01.147016 4786 generic.go:334] "Generic (PLEG): container finished" podID="9fcdaec4-eb84-4831-936e-56b3fdb0ef96" containerID="8cbdee797d5efec39e57b237318c3eaa481efc4e869754af7ce23e07a173aa78" exitCode=0
Mar 13 12:30:01 crc kubenswrapper[4786]: I0313 12:30:01.147043 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" event={"ID":"9fcdaec4-eb84-4831-936e-56b3fdb0ef96","Type":"ContainerDied","Data":"8cbdee797d5efec39e57b237318c3eaa481efc4e869754af7ce23e07a173aa78"}
Mar 13 12:30:01 crc kubenswrapper[4786]: I0313 12:30:01.147058 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" event={"ID":"9fcdaec4-eb84-4831-936e-56b3fdb0ef96","Type":"ContainerStarted","Data":"99c12229eaf383b58c08fc62d47b9532c9aa1d084ba99e4eb25f7e34c7526a21"}
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.400714 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w"
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.545829 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-config-volume\") pod \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") "
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.545962 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-secret-volume\") pod \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") "
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.546088 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvvpn\" (UniqueName: \"kubernetes.io/projected/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-kube-api-access-xvvpn\") pod \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\" (UID: \"9fcdaec4-eb84-4831-936e-56b3fdb0ef96\") "
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.546606 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-config-volume" (OuterVolumeSpecName: "config-volume") pod "9fcdaec4-eb84-4831-936e-56b3fdb0ef96" (UID: "9fcdaec4-eb84-4831-936e-56b3fdb0ef96"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.551631 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9fcdaec4-eb84-4831-936e-56b3fdb0ef96" (UID: "9fcdaec4-eb84-4831-936e-56b3fdb0ef96"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.556193 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-kube-api-access-xvvpn" (OuterVolumeSpecName: "kube-api-access-xvvpn") pod "9fcdaec4-eb84-4831-936e-56b3fdb0ef96" (UID: "9fcdaec4-eb84-4831-936e-56b3fdb0ef96"). InnerVolumeSpecName "kube-api-access-xvvpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.648759 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.648850 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvvpn\" (UniqueName: \"kubernetes.io/projected/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-kube-api-access-xvvpn\") on node \"crc\" DevicePath \"\""
Mar 13 12:30:02 crc kubenswrapper[4786]: I0313 12:30:02.648863 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fcdaec4-eb84-4831-936e-56b3fdb0ef96-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 12:30:03 crc kubenswrapper[4786]: I0313 12:30:03.163170 4786 generic.go:334] "Generic (PLEG): container finished" podID="6ef9a654-762c-4834-a1d9-3eeca80a0561" containerID="e2978cb814ccc80a10454ef963158a500655eeb4df0cf202b273f97926b66098" exitCode=0
Mar 13 12:30:03 crc kubenswrapper[4786]: I0313 12:30:03.163240 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-p4nw4" event={"ID":"6ef9a654-762c-4834-a1d9-3eeca80a0561","Type":"ContainerDied","Data":"e2978cb814ccc80a10454ef963158a500655eeb4df0cf202b273f97926b66098"}
Mar 13 12:30:03 crc kubenswrapper[4786]: I0313 12:30:03.164714 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w" event={"ID":"9fcdaec4-eb84-4831-936e-56b3fdb0ef96","Type":"ContainerDied","Data":"99c12229eaf383b58c08fc62d47b9532c9aa1d084ba99e4eb25f7e34c7526a21"}
Mar 13 12:30:03 crc kubenswrapper[4786]: I0313 12:30:03.164745 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99c12229eaf383b58c08fc62d47b9532c9aa1d084ba99e4eb25f7e34c7526a21"
Mar 13 12:30:03 crc kubenswrapper[4786]: I0313 12:30:03.164760 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-p949w"
Mar 13 12:30:03 crc kubenswrapper[4786]: I0313 12:30:03.479048 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf"]
Mar 13 12:30:03 crc kubenswrapper[4786]: I0313 12:30:03.484252 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-gkzpf"]
Mar 13 12:30:04 crc kubenswrapper[4786]: I0313 12:30:04.415201 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-p4nw4"
Mar 13 12:30:04 crc kubenswrapper[4786]: I0313 12:30:04.573961 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9mbf\" (UniqueName: \"kubernetes.io/projected/6ef9a654-762c-4834-a1d9-3eeca80a0561-kube-api-access-m9mbf\") pod \"6ef9a654-762c-4834-a1d9-3eeca80a0561\" (UID: \"6ef9a654-762c-4834-a1d9-3eeca80a0561\") "
Mar 13 12:30:04 crc kubenswrapper[4786]: I0313 12:30:04.578542 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef9a654-762c-4834-a1d9-3eeca80a0561-kube-api-access-m9mbf" (OuterVolumeSpecName: "kube-api-access-m9mbf") pod "6ef9a654-762c-4834-a1d9-3eeca80a0561" (UID: "6ef9a654-762c-4834-a1d9-3eeca80a0561"). InnerVolumeSpecName "kube-api-access-m9mbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:30:04 crc kubenswrapper[4786]: I0313 12:30:04.676029 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9mbf\" (UniqueName: \"kubernetes.io/projected/6ef9a654-762c-4834-a1d9-3eeca80a0561-kube-api-access-m9mbf\") on node \"crc\" DevicePath \"\""
Mar 13 12:30:05 crc kubenswrapper[4786]: I0313 12:30:05.177310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-p4nw4" event={"ID":"6ef9a654-762c-4834-a1d9-3eeca80a0561","Type":"ContainerDied","Data":"a77c933fcc29a7de4009a5d540d37e2f2a34f97021aa9d0ca4bafd3a29856486"}
Mar 13 12:30:05 crc kubenswrapper[4786]: I0313 12:30:05.177669 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a77c933fcc29a7de4009a5d540d37e2f2a34f97021aa9d0ca4bafd3a29856486"
Mar 13 12:30:05 crc kubenswrapper[4786]: I0313 12:30:05.177360 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-p4nw4"
Mar 13 12:30:05 crc kubenswrapper[4786]: I0313 12:30:05.464020 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4675c625-0d2c-4358-9241-627d96dcb2f0" path="/var/lib/kubelet/pods/4675c625-0d2c-4358-9241-627d96dcb2f0/volumes"
Mar 13 12:30:05 crc kubenswrapper[4786]: I0313 12:30:05.474495 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-c6l45"]
Mar 13 12:30:05 crc kubenswrapper[4786]: I0313 12:30:05.480628 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-c6l45"]
Mar 13 12:30:07 crc kubenswrapper[4786]: I0313 12:30:07.450221 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d6c7377-4631-4d28-9983-7a43fbee8c3e" path="/var/lib/kubelet/pods/4d6c7377-4631-4d28-9983-7a43fbee8c3e/volumes"
Mar 13 12:30:10 crc kubenswrapper[4786]: I0313 12:30:10.441596 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:30:10 crc kubenswrapper[4786]: E0313 12:30:10.442388 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:30:17 crc kubenswrapper[4786]: I0313 12:30:17.254736 4786 scope.go:117] "RemoveContainer" containerID="56c05258f4546d6e676a68dfb301418f6e7f8fcce3cd8dcf8438a2b735e7eb25"
Mar 13 12:30:17 crc kubenswrapper[4786]: I0313 12:30:17.287700 4786 scope.go:117] "RemoveContainer" containerID="1516b8433a9509bf7e7a6a00f912f6911ce3c22a6cf5cb2bec40001954dc9889"
Mar 13 12:30:24 crc kubenswrapper[4786]: I0313 12:30:24.441518 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:30:24 crc kubenswrapper[4786]: E0313 12:30:24.442611 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:30:37 crc kubenswrapper[4786]: I0313 12:30:37.440568 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:30:37 crc kubenswrapper[4786]: E0313 12:30:37.441234 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:30:52 crc kubenswrapper[4786]: I0313 12:30:52.441581 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:30:52 crc kubenswrapper[4786]: E0313 12:30:52.442552 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:31:03 crc kubenswrapper[4786]: I0313 12:31:03.445222 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:31:03 crc kubenswrapper[4786]: E0313 12:31:03.445964 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:31:15 crc kubenswrapper[4786]: I0313 12:31:15.441279 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:31:15 crc kubenswrapper[4786]: E0313 12:31:15.442469 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:31:28 crc kubenswrapper[4786]: I0313 12:31:28.440688 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:31:28 crc kubenswrapper[4786]: E0313 12:31:28.441546 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:31:40 crc kubenswrapper[4786]: I0313 12:31:40.441248 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:31:40 crc kubenswrapper[4786]: E0313 12:31:40.442423 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:31:51 crc kubenswrapper[4786]: I0313 12:31:51.440985 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:31:51 crc kubenswrapper[4786]: E0313 12:31:51.441840 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.150862 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556752-fvpbq"]
Mar 13 12:32:00 crc kubenswrapper[4786]: E0313 12:32:00.151799 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef9a654-762c-4834-a1d9-3eeca80a0561" containerName="oc"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.151816 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef9a654-762c-4834-a1d9-3eeca80a0561" containerName="oc"
Mar 13 12:32:00 crc kubenswrapper[4786]: E0313 12:32:00.151851 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcdaec4-eb84-4831-936e-56b3fdb0ef96" containerName="collect-profiles"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.151857 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcdaec4-eb84-4831-936e-56b3fdb0ef96" containerName="collect-profiles"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.152006 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fcdaec4-eb84-4831-936e-56b3fdb0ef96" containerName="collect-profiles"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.152028 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef9a654-762c-4834-a1d9-3eeca80a0561" containerName="oc"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.152471 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-fvpbq"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.154875 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.158631 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.158949 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.164851 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-fvpbq"]
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.205263 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7q9q\" (UniqueName: \"kubernetes.io/projected/b5aa643d-cff3-4059-aaaa-fe62250d0601-kube-api-access-b7q9q\") pod \"auto-csr-approver-29556752-fvpbq\" (UID: \"b5aa643d-cff3-4059-aaaa-fe62250d0601\") " pod="openshift-infra/auto-csr-approver-29556752-fvpbq"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.307466 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7q9q\" (UniqueName: \"kubernetes.io/projected/b5aa643d-cff3-4059-aaaa-fe62250d0601-kube-api-access-b7q9q\") pod \"auto-csr-approver-29556752-fvpbq\" (UID: \"b5aa643d-cff3-4059-aaaa-fe62250d0601\") " pod="openshift-infra/auto-csr-approver-29556752-fvpbq"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.328490 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7q9q\" (UniqueName: \"kubernetes.io/projected/b5aa643d-cff3-4059-aaaa-fe62250d0601-kube-api-access-b7q9q\") pod \"auto-csr-approver-29556752-fvpbq\" (UID: \"b5aa643d-cff3-4059-aaaa-fe62250d0601\") " pod="openshift-infra/auto-csr-approver-29556752-fvpbq"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.485831 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-fvpbq"
Mar 13 12:32:00 crc kubenswrapper[4786]: I0313 12:32:00.717557 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-fvpbq"]
Mar 13 12:32:01 crc kubenswrapper[4786]: I0313 12:32:01.072276 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-fvpbq" event={"ID":"b5aa643d-cff3-4059-aaaa-fe62250d0601","Type":"ContainerStarted","Data":"92d3924239d8235291afbd49d56e6249787e66b357be1adad4aeb1bfb87f5132"}
Mar 13 12:32:02 crc kubenswrapper[4786]: I0313 12:32:02.081628 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5aa643d-cff3-4059-aaaa-fe62250d0601" containerID="d0944c2d6de9ad3c6e5a72672f44a49321a24626918d7b367a9edc3af73b49a5" exitCode=0
Mar 13 12:32:02 crc kubenswrapper[4786]: I0313 12:32:02.081728 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-fvpbq" event={"ID":"b5aa643d-cff3-4059-aaaa-fe62250d0601","Type":"ContainerDied","Data":"d0944c2d6de9ad3c6e5a72672f44a49321a24626918d7b367a9edc3af73b49a5"}
Mar 13 12:32:03 crc kubenswrapper[4786]: I0313 12:32:03.415071 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-fvpbq"
Mar 13 12:32:03 crc kubenswrapper[4786]: I0313 12:32:03.551445 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7q9q\" (UniqueName: \"kubernetes.io/projected/b5aa643d-cff3-4059-aaaa-fe62250d0601-kube-api-access-b7q9q\") pod \"b5aa643d-cff3-4059-aaaa-fe62250d0601\" (UID: \"b5aa643d-cff3-4059-aaaa-fe62250d0601\") "
Mar 13 12:32:03 crc kubenswrapper[4786]: I0313 12:32:03.560251 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5aa643d-cff3-4059-aaaa-fe62250d0601-kube-api-access-b7q9q" (OuterVolumeSpecName: "kube-api-access-b7q9q") pod "b5aa643d-cff3-4059-aaaa-fe62250d0601" (UID: "b5aa643d-cff3-4059-aaaa-fe62250d0601"). InnerVolumeSpecName "kube-api-access-b7q9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:32:03 crc kubenswrapper[4786]: I0313 12:32:03.653603 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7q9q\" (UniqueName: \"kubernetes.io/projected/b5aa643d-cff3-4059-aaaa-fe62250d0601-kube-api-access-b7q9q\") on node \"crc\" DevicePath \"\""
Mar 13 12:32:04 crc kubenswrapper[4786]: I0313 12:32:04.101149 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-fvpbq" event={"ID":"b5aa643d-cff3-4059-aaaa-fe62250d0601","Type":"ContainerDied","Data":"92d3924239d8235291afbd49d56e6249787e66b357be1adad4aeb1bfb87f5132"}
Mar 13 12:32:04 crc kubenswrapper[4786]: I0313 12:32:04.101200 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92d3924239d8235291afbd49d56e6249787e66b357be1adad4aeb1bfb87f5132"
Mar 13 12:32:04 crc kubenswrapper[4786]: I0313 12:32:04.101282 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-fvpbq"
Mar 13 12:32:04 crc kubenswrapper[4786]: I0313 12:32:04.442945 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:32:04 crc kubenswrapper[4786]: E0313 12:32:04.443353 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:32:04 crc kubenswrapper[4786]: I0313 12:32:04.480266 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-s7xvx"]
Mar 13 12:32:04 crc kubenswrapper[4786]: I0313 12:32:04.485299 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-s7xvx"]
Mar 13 12:32:05 crc kubenswrapper[4786]: I0313 12:32:05.458439 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703" path="/var/lib/kubelet/pods/a7cfe23d-bb0e-4ded-84d6-fe9c0ec91703/volumes"
Mar 13 12:32:17 crc kubenswrapper[4786]: I0313 12:32:17.386002 4786 scope.go:117] "RemoveContainer" containerID="a019deac1eb439493d89c5e62810f2215beac2152d1faa2a84ce9410562f30f5"
Mar 13 12:32:17 crc kubenswrapper[4786]: I0313 12:32:17.440217 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:32:18 crc kubenswrapper[4786]: I0313 12:32:18.211761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"6cfddb721df23cfcce85ce6f479e1e0451a74c6a861ffb572b373f22dcfb3ad1"}
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.148759 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556754-qwk5s"]
Mar 13 12:34:00 crc kubenswrapper[4786]: E0313 12:34:00.149534 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5aa643d-cff3-4059-aaaa-fe62250d0601" containerName="oc"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.149546 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5aa643d-cff3-4059-aaaa-fe62250d0601" containerName="oc"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.149691 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5aa643d-cff3-4059-aaaa-fe62250d0601" containerName="oc"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.150139 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-qwk5s"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.153061 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.153638 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.153642 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.167080 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-qwk5s"]
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.197791 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl9pp\" (UniqueName: \"kubernetes.io/projected/c5fce5a6-e739-4860-9e11-c874a6f2d232-kube-api-access-kl9pp\") pod \"auto-csr-approver-29556754-qwk5s\" (UID: \"c5fce5a6-e739-4860-9e11-c874a6f2d232\") " pod="openshift-infra/auto-csr-approver-29556754-qwk5s"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.299028 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl9pp\" (UniqueName: \"kubernetes.io/projected/c5fce5a6-e739-4860-9e11-c874a6f2d232-kube-api-access-kl9pp\") pod \"auto-csr-approver-29556754-qwk5s\" (UID: \"c5fce5a6-e739-4860-9e11-c874a6f2d232\") " pod="openshift-infra/auto-csr-approver-29556754-qwk5s"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.332960 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl9pp\" (UniqueName: \"kubernetes.io/projected/c5fce5a6-e739-4860-9e11-c874a6f2d232-kube-api-access-kl9pp\") pod \"auto-csr-approver-29556754-qwk5s\" (UID: \"c5fce5a6-e739-4860-9e11-c874a6f2d232\") " pod="openshift-infra/auto-csr-approver-29556754-qwk5s"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.478693 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-qwk5s"
Mar 13 12:34:00 crc kubenswrapper[4786]: I0313 12:34:00.893452 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-qwk5s"]
Mar 13 12:34:01 crc kubenswrapper[4786]: I0313 12:34:01.002785 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-qwk5s" event={"ID":"c5fce5a6-e739-4860-9e11-c874a6f2d232","Type":"ContainerStarted","Data":"82907d5d3c3a6a45f91e98c0f7f454824c5b4bb08a83841113cbf9977caa841d"}
Mar 13 12:34:03 crc kubenswrapper[4786]: I0313 12:34:03.049716 4786 generic.go:334] "Generic (PLEG): container finished" podID="c5fce5a6-e739-4860-9e11-c874a6f2d232" containerID="56051ec14e93ef5fa22ac947c2f78edaace8b4d527da3805e2f6103cf218d7e1" exitCode=0
Mar 13 12:34:03 crc kubenswrapper[4786]: I0313 12:34:03.049831 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-qwk5s" event={"ID":"c5fce5a6-e739-4860-9e11-c874a6f2d232","Type":"ContainerDied","Data":"56051ec14e93ef5fa22ac947c2f78edaace8b4d527da3805e2f6103cf218d7e1"}
Mar 13 12:34:04 crc kubenswrapper[4786]: I0313 12:34:04.331954 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-qwk5s"
Mar 13 12:34:04 crc kubenswrapper[4786]: I0313 12:34:04.461498 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl9pp\" (UniqueName: \"kubernetes.io/projected/c5fce5a6-e739-4860-9e11-c874a6f2d232-kube-api-access-kl9pp\") pod \"c5fce5a6-e739-4860-9e11-c874a6f2d232\" (UID: \"c5fce5a6-e739-4860-9e11-c874a6f2d232\") "
Mar 13 12:34:04 crc kubenswrapper[4786]: I0313 12:34:04.469211 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5fce5a6-e739-4860-9e11-c874a6f2d232-kube-api-access-kl9pp" (OuterVolumeSpecName: "kube-api-access-kl9pp") pod "c5fce5a6-e739-4860-9e11-c874a6f2d232" (UID: "c5fce5a6-e739-4860-9e11-c874a6f2d232"). InnerVolumeSpecName "kube-api-access-kl9pp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:34:04 crc kubenswrapper[4786]: I0313 12:34:04.563858 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl9pp\" (UniqueName: \"kubernetes.io/projected/c5fce5a6-e739-4860-9e11-c874a6f2d232-kube-api-access-kl9pp\") on node \"crc\" DevicePath \"\""
Mar 13 12:34:05 crc kubenswrapper[4786]: I0313 12:34:05.070638 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-qwk5s" event={"ID":"c5fce5a6-e739-4860-9e11-c874a6f2d232","Type":"ContainerDied","Data":"82907d5d3c3a6a45f91e98c0f7f454824c5b4bb08a83841113cbf9977caa841d"}
Mar 13 12:34:05 crc kubenswrapper[4786]: I0313 12:34:05.070990 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82907d5d3c3a6a45f91e98c0f7f454824c5b4bb08a83841113cbf9977caa841d"
Mar 13 12:34:05 crc kubenswrapper[4786]: I0313 12:34:05.070694 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-qwk5s"
Mar 13 12:34:05 crc kubenswrapper[4786]: I0313 12:34:05.397291 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-4qlwm"]
Mar 13 12:34:05 crc kubenswrapper[4786]: I0313 12:34:05.402573 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-4qlwm"]
Mar 13 12:34:05 crc kubenswrapper[4786]: I0313 12:34:05.450171 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80fd49ea-8f19-4ffa-9ef5-298c34c4dea6" path="/var/lib/kubelet/pods/80fd49ea-8f19-4ffa-9ef5-298c34c4dea6/volumes"
Mar 13 12:34:17 crc kubenswrapper[4786]: I0313 12:34:17.477586 4786 scope.go:117] "RemoveContainer" containerID="93d1dbd4768d90ec32952728df37983b9dfa60a30a38c4ec7e7389b29913417f"
Mar 13 12:34:38 crc kubenswrapper[4786]: I0313 12:34:38.169027 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:34:38 crc kubenswrapper[4786]: I0313 12:34:38.169629 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:35:08 crc kubenswrapper[4786]: I0313 12:35:08.169723 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:35:08 crc kubenswrapper[4786]: I0313 12:35:08.170272 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.169655 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.170305 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.170372 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8"
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.171282 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6cfddb721df23cfcce85ce6f479e1e0451a74c6a861ffb572b373f22dcfb3ad1"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.171388 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://6cfddb721df23cfcce85ce6f479e1e0451a74c6a861ffb572b373f22dcfb3ad1" gracePeriod=600
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.739764 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="6cfddb721df23cfcce85ce6f479e1e0451a74c6a861ffb572b373f22dcfb3ad1" exitCode=0
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.739843 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"6cfddb721df23cfcce85ce6f479e1e0451a74c6a861ffb572b373f22dcfb3ad1"}
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.740127 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"}
Mar 13 12:35:38 crc kubenswrapper[4786]: I0313 12:35:38.740154 4786 scope.go:117] "RemoveContainer" containerID="c5b87cdb8bd331e2263ab54d9055d51b69cd1ec857b863f853e1e76cc738373f"
Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.140916 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556756-rc9m7"]
Mar 13 12:36:00 crc kubenswrapper[4786]: E0313 12:36:00.141830 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fce5a6-e739-4860-9e11-c874a6f2d232" containerName="oc"
Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.141850 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fce5a6-e739-4860-9e11-c874a6f2d232" containerName="oc"
Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.142077 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fce5a6-e739-4860-9e11-c874a6f2d232" containerName="oc"
Mar 13 12:36:00
crc kubenswrapper[4786]: I0313 12:36:00.142607 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-rc9m7" Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.145124 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.145703 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.153424 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.156471 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-rc9m7"] Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.332899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8flp\" (UniqueName: \"kubernetes.io/projected/740920fa-04bf-476f-88e9-b8d4abe078df-kube-api-access-d8flp\") pod \"auto-csr-approver-29556756-rc9m7\" (UID: \"740920fa-04bf-476f-88e9-b8d4abe078df\") " pod="openshift-infra/auto-csr-approver-29556756-rc9m7" Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.434755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8flp\" (UniqueName: \"kubernetes.io/projected/740920fa-04bf-476f-88e9-b8d4abe078df-kube-api-access-d8flp\") pod \"auto-csr-approver-29556756-rc9m7\" (UID: \"740920fa-04bf-476f-88e9-b8d4abe078df\") " pod="openshift-infra/auto-csr-approver-29556756-rc9m7" Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.463145 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8flp\" (UniqueName: \"kubernetes.io/projected/740920fa-04bf-476f-88e9-b8d4abe078df-kube-api-access-d8flp\") 
pod \"auto-csr-approver-29556756-rc9m7\" (UID: \"740920fa-04bf-476f-88e9-b8d4abe078df\") " pod="openshift-infra/auto-csr-approver-29556756-rc9m7" Mar 13 12:36:00 crc kubenswrapper[4786]: I0313 12:36:00.762327 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-rc9m7" Mar 13 12:36:01 crc kubenswrapper[4786]: I0313 12:36:01.170967 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-rc9m7"] Mar 13 12:36:01 crc kubenswrapper[4786]: I0313 12:36:01.185864 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:36:01 crc kubenswrapper[4786]: I0313 12:36:01.950274 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-rc9m7" event={"ID":"740920fa-04bf-476f-88e9-b8d4abe078df","Type":"ContainerStarted","Data":"f25495f2c9febce7b5d2c667fa8412cd4325ca23d69050a8baaa3db1463b47cf"} Mar 13 12:36:02 crc kubenswrapper[4786]: I0313 12:36:02.959128 4786 generic.go:334] "Generic (PLEG): container finished" podID="740920fa-04bf-476f-88e9-b8d4abe078df" containerID="7f0421a07ad07fead3a8af38068edfe7dc9e98a4544eb492a87346f25c483036" exitCode=0 Mar 13 12:36:02 crc kubenswrapper[4786]: I0313 12:36:02.959545 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-rc9m7" event={"ID":"740920fa-04bf-476f-88e9-b8d4abe078df","Type":"ContainerDied","Data":"7f0421a07ad07fead3a8af38068edfe7dc9e98a4544eb492a87346f25c483036"} Mar 13 12:36:04 crc kubenswrapper[4786]: I0313 12:36:04.283618 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-rc9m7" Mar 13 12:36:04 crc kubenswrapper[4786]: I0313 12:36:04.396212 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8flp\" (UniqueName: \"kubernetes.io/projected/740920fa-04bf-476f-88e9-b8d4abe078df-kube-api-access-d8flp\") pod \"740920fa-04bf-476f-88e9-b8d4abe078df\" (UID: \"740920fa-04bf-476f-88e9-b8d4abe078df\") " Mar 13 12:36:04 crc kubenswrapper[4786]: I0313 12:36:04.401385 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740920fa-04bf-476f-88e9-b8d4abe078df-kube-api-access-d8flp" (OuterVolumeSpecName: "kube-api-access-d8flp") pod "740920fa-04bf-476f-88e9-b8d4abe078df" (UID: "740920fa-04bf-476f-88e9-b8d4abe078df"). InnerVolumeSpecName "kube-api-access-d8flp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:36:04 crc kubenswrapper[4786]: I0313 12:36:04.498366 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8flp\" (UniqueName: \"kubernetes.io/projected/740920fa-04bf-476f-88e9-b8d4abe078df-kube-api-access-d8flp\") on node \"crc\" DevicePath \"\"" Mar 13 12:36:04 crc kubenswrapper[4786]: I0313 12:36:04.975961 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-rc9m7" event={"ID":"740920fa-04bf-476f-88e9-b8d4abe078df","Type":"ContainerDied","Data":"f25495f2c9febce7b5d2c667fa8412cd4325ca23d69050a8baaa3db1463b47cf"} Mar 13 12:36:04 crc kubenswrapper[4786]: I0313 12:36:04.976007 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f25495f2c9febce7b5d2c667fa8412cd4325ca23d69050a8baaa3db1463b47cf" Mar 13 12:36:04 crc kubenswrapper[4786]: I0313 12:36:04.976080 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-rc9m7" Mar 13 12:36:05 crc kubenswrapper[4786]: I0313 12:36:05.358472 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-p4nw4"] Mar 13 12:36:05 crc kubenswrapper[4786]: I0313 12:36:05.363453 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-p4nw4"] Mar 13 12:36:05 crc kubenswrapper[4786]: I0313 12:36:05.452283 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef9a654-762c-4834-a1d9-3eeca80a0561" path="/var/lib/kubelet/pods/6ef9a654-762c-4834-a1d9-3eeca80a0561/volumes" Mar 13 12:36:17 crc kubenswrapper[4786]: I0313 12:36:17.554379 4786 scope.go:117] "RemoveContainer" containerID="e2978cb814ccc80a10454ef963158a500655eeb4df0cf202b273f97926b66098" Mar 13 12:37:38 crc kubenswrapper[4786]: I0313 12:37:38.169181 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:37:38 crc kubenswrapper[4786]: I0313 12:37:38.169777 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.608294 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mkbg5"] Mar 13 12:37:41 crc kubenswrapper[4786]: E0313 12:37:41.608987 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740920fa-04bf-476f-88e9-b8d4abe078df" containerName="oc" Mar 13 12:37:41 crc 
kubenswrapper[4786]: I0313 12:37:41.608999 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="740920fa-04bf-476f-88e9-b8d4abe078df" containerName="oc" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.609133 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="740920fa-04bf-476f-88e9-b8d4abe078df" containerName="oc" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.610132 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.622311 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkbg5"] Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.766527 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-utilities\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.766596 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-catalog-content\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.766619 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hznfr\" (UniqueName: \"kubernetes.io/projected/c2d78c1a-e901-4674-bef4-f837fbf5d05c-kube-api-access-hznfr\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: 
I0313 12:37:41.868345 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-utilities\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.868415 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-catalog-content\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.868442 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hznfr\" (UniqueName: \"kubernetes.io/projected/c2d78c1a-e901-4674-bef4-f837fbf5d05c-kube-api-access-hznfr\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.868846 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-utilities\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.868945 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-catalog-content\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.896083 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hznfr\" (UniqueName: \"kubernetes.io/projected/c2d78c1a-e901-4674-bef4-f837fbf5d05c-kube-api-access-hznfr\") pod \"redhat-marketplace-mkbg5\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:41 crc kubenswrapper[4786]: I0313 12:37:41.926654 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:42 crc kubenswrapper[4786]: I0313 12:37:42.365986 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkbg5"] Mar 13 12:37:42 crc kubenswrapper[4786]: I0313 12:37:42.850276 4786 generic.go:334] "Generic (PLEG): container finished" podID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerID="02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569" exitCode=0 Mar 13 12:37:42 crc kubenswrapper[4786]: I0313 12:37:42.850319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkbg5" event={"ID":"c2d78c1a-e901-4674-bef4-f837fbf5d05c","Type":"ContainerDied","Data":"02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569"} Mar 13 12:37:42 crc kubenswrapper[4786]: I0313 12:37:42.850572 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkbg5" event={"ID":"c2d78c1a-e901-4674-bef4-f837fbf5d05c","Type":"ContainerStarted","Data":"6f0851eb32f74b7ae894e4adf43fee21cce134b1c2a53e23ef9b75a94e90a9f6"} Mar 13 12:37:44 crc kubenswrapper[4786]: I0313 12:37:44.870657 4786 generic.go:334] "Generic (PLEG): container finished" podID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerID="ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6" exitCode=0 Mar 13 12:37:44 crc kubenswrapper[4786]: I0313 12:37:44.870807 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-mkbg5" event={"ID":"c2d78c1a-e901-4674-bef4-f837fbf5d05c","Type":"ContainerDied","Data":"ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6"} Mar 13 12:37:45 crc kubenswrapper[4786]: I0313 12:37:45.879163 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkbg5" event={"ID":"c2d78c1a-e901-4674-bef4-f837fbf5d05c","Type":"ContainerStarted","Data":"def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b"} Mar 13 12:37:45 crc kubenswrapper[4786]: I0313 12:37:45.906144 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mkbg5" podStartSLOduration=2.424324137 podStartE2EDuration="4.906110887s" podCreationTimestamp="2026-03-13 12:37:41 +0000 UTC" firstStartedPulling="2026-03-13 12:37:42.852010753 +0000 UTC m=+3050.131664200" lastFinishedPulling="2026-03-13 12:37:45.333797503 +0000 UTC m=+3052.613450950" observedRunningTime="2026-03-13 12:37:45.900521352 +0000 UTC m=+3053.180174809" watchObservedRunningTime="2026-03-13 12:37:45.906110887 +0000 UTC m=+3053.185764324" Mar 13 12:37:51 crc kubenswrapper[4786]: I0313 12:37:51.927101 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:51 crc kubenswrapper[4786]: I0313 12:37:51.927704 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:51 crc kubenswrapper[4786]: I0313 12:37:51.970304 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:52 crc kubenswrapper[4786]: I0313 12:37:52.967147 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:53 crc kubenswrapper[4786]: I0313 12:37:53.013556 4786 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkbg5"] Mar 13 12:37:54 crc kubenswrapper[4786]: I0313 12:37:54.938122 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mkbg5" podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerName="registry-server" containerID="cri-o://def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b" gracePeriod=2 Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.552848 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.681971 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-utilities\") pod \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.682389 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-catalog-content\") pod \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.682565 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hznfr\" (UniqueName: \"kubernetes.io/projected/c2d78c1a-e901-4674-bef4-f837fbf5d05c-kube-api-access-hznfr\") pod \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\" (UID: \"c2d78c1a-e901-4674-bef4-f837fbf5d05c\") " Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.684022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-utilities" (OuterVolumeSpecName: "utilities") pod 
"c2d78c1a-e901-4674-bef4-f837fbf5d05c" (UID: "c2d78c1a-e901-4674-bef4-f837fbf5d05c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.693501 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d78c1a-e901-4674-bef4-f837fbf5d05c-kube-api-access-hznfr" (OuterVolumeSpecName: "kube-api-access-hznfr") pod "c2d78c1a-e901-4674-bef4-f837fbf5d05c" (UID: "c2d78c1a-e901-4674-bef4-f837fbf5d05c"). InnerVolumeSpecName "kube-api-access-hznfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.719871 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2d78c1a-e901-4674-bef4-f837fbf5d05c" (UID: "c2d78c1a-e901-4674-bef4-f837fbf5d05c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.784367 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hznfr\" (UniqueName: \"kubernetes.io/projected/c2d78c1a-e901-4674-bef4-f837fbf5d05c-kube-api-access-hznfr\") on node \"crc\" DevicePath \"\"" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.784424 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.784434 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d78c1a-e901-4674-bef4-f837fbf5d05c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.957165 4786 generic.go:334] "Generic (PLEG): container finished" podID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerID="def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b" exitCode=0 Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.957217 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkbg5" event={"ID":"c2d78c1a-e901-4674-bef4-f837fbf5d05c","Type":"ContainerDied","Data":"def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b"} Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.957243 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkbg5" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.957282 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkbg5" event={"ID":"c2d78c1a-e901-4674-bef4-f837fbf5d05c","Type":"ContainerDied","Data":"6f0851eb32f74b7ae894e4adf43fee21cce134b1c2a53e23ef9b75a94e90a9f6"} Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.957309 4786 scope.go:117] "RemoveContainer" containerID="def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.973906 4786 scope.go:117] "RemoveContainer" containerID="ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6" Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.991058 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkbg5"] Mar 13 12:37:56 crc kubenswrapper[4786]: I0313 12:37:56.996206 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkbg5"] Mar 13 12:37:57 crc kubenswrapper[4786]: I0313 12:37:57.015086 4786 scope.go:117] "RemoveContainer" containerID="02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569" Mar 13 12:37:57 crc kubenswrapper[4786]: I0313 12:37:57.030900 4786 scope.go:117] "RemoveContainer" containerID="def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b" Mar 13 12:37:57 crc kubenswrapper[4786]: E0313 12:37:57.031276 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b\": container with ID starting with def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b not found: ID does not exist" containerID="def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b" Mar 13 12:37:57 crc kubenswrapper[4786]: I0313 12:37:57.031317 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b"} err="failed to get container status \"def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b\": rpc error: code = NotFound desc = could not find container \"def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b\": container with ID starting with def44383d3276c63dd9fa2361f89878b440af83f0dc39279c2575a1f71825f0b not found: ID does not exist" Mar 13 12:37:57 crc kubenswrapper[4786]: I0313 12:37:57.031343 4786 scope.go:117] "RemoveContainer" containerID="ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6" Mar 13 12:37:57 crc kubenswrapper[4786]: E0313 12:37:57.031633 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6\": container with ID starting with ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6 not found: ID does not exist" containerID="ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6" Mar 13 12:37:57 crc kubenswrapper[4786]: I0313 12:37:57.031692 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6"} err="failed to get container status \"ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6\": rpc error: code = NotFound desc = could not find container \"ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6\": container with ID starting with ba0ae2043573c5f01a53731ce5a7a75eb04f5f2b5dd5c5b367cc72b58d90fba6 not found: ID does not exist" Mar 13 12:37:57 crc kubenswrapper[4786]: I0313 12:37:57.031723 4786 scope.go:117] "RemoveContainer" containerID="02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569" Mar 13 12:37:57 crc kubenswrapper[4786]: E0313 
12:37:57.032027 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569\": container with ID starting with 02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569 not found: ID does not exist" containerID="02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569" Mar 13 12:37:57 crc kubenswrapper[4786]: I0313 12:37:57.032060 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569"} err="failed to get container status \"02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569\": rpc error: code = NotFound desc = could not find container \"02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569\": container with ID starting with 02446a55de1c7a888a9af275f4cc8e033f651b626960d2ce366a38ac8d414569 not found: ID does not exist" Mar 13 12:37:57 crc kubenswrapper[4786]: I0313 12:37:57.452988 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" path="/var/lib/kubelet/pods/c2d78c1a-e901-4674-bef4-f837fbf5d05c/volumes" Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.157039 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556758-xw6z4"] Mar 13 12:38:00 crc kubenswrapper[4786]: E0313 12:38:00.158035 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerName="registry-server" Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.158056 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerName="registry-server" Mar 13 12:38:00 crc kubenswrapper[4786]: E0313 12:38:00.158112 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerName="extract-content"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.158122 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerName="extract-content"
Mar 13 12:38:00 crc kubenswrapper[4786]: E0313 12:38:00.158134 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerName="extract-utilities"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.158143 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerName="extract-utilities"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.158355 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d78c1a-e901-4674-bef4-f837fbf5d05c" containerName="registry-server"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.159082 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-xw6z4"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.161282 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.161347 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.161282 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.164063 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-xw6z4"]
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.234476 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l9jr\" (UniqueName: \"kubernetes.io/projected/3a5447ce-6139-4ace-92f0-4b024b5d7cfb-kube-api-access-5l9jr\") pod \"auto-csr-approver-29556758-xw6z4\" (UID: \"3a5447ce-6139-4ace-92f0-4b024b5d7cfb\") " pod="openshift-infra/auto-csr-approver-29556758-xw6z4"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.335581 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9jr\" (UniqueName: \"kubernetes.io/projected/3a5447ce-6139-4ace-92f0-4b024b5d7cfb-kube-api-access-5l9jr\") pod \"auto-csr-approver-29556758-xw6z4\" (UID: \"3a5447ce-6139-4ace-92f0-4b024b5d7cfb\") " pod="openshift-infra/auto-csr-approver-29556758-xw6z4"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.355117 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9jr\" (UniqueName: \"kubernetes.io/projected/3a5447ce-6139-4ace-92f0-4b024b5d7cfb-kube-api-access-5l9jr\") pod \"auto-csr-approver-29556758-xw6z4\" (UID: \"3a5447ce-6139-4ace-92f0-4b024b5d7cfb\") " pod="openshift-infra/auto-csr-approver-29556758-xw6z4"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.474711 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-xw6z4"
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.873235 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-xw6z4"]
Mar 13 12:38:00 crc kubenswrapper[4786]: I0313 12:38:00.987030 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-xw6z4" event={"ID":"3a5447ce-6139-4ace-92f0-4b024b5d7cfb","Type":"ContainerStarted","Data":"6571f69881565cfd933b4e8e25eb45fce1e7d57cccb4d165568bd139ce9ad1a2"}
Mar 13 12:38:03 crc kubenswrapper[4786]: I0313 12:38:03.008282 4786 generic.go:334] "Generic (PLEG): container finished" podID="3a5447ce-6139-4ace-92f0-4b024b5d7cfb" containerID="bce9864d9ae4a215cd60d7776f0bdc1de3c8bb11a6177e22fe632afc3a25a035" exitCode=0
Mar 13 12:38:03 crc kubenswrapper[4786]: I0313 12:38:03.008335 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-xw6z4" event={"ID":"3a5447ce-6139-4ace-92f0-4b024b5d7cfb","Type":"ContainerDied","Data":"bce9864d9ae4a215cd60d7776f0bdc1de3c8bb11a6177e22fe632afc3a25a035"}
Mar 13 12:38:04 crc kubenswrapper[4786]: I0313 12:38:04.300283 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-xw6z4"
Mar 13 12:38:04 crc kubenswrapper[4786]: I0313 12:38:04.492811 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l9jr\" (UniqueName: \"kubernetes.io/projected/3a5447ce-6139-4ace-92f0-4b024b5d7cfb-kube-api-access-5l9jr\") pod \"3a5447ce-6139-4ace-92f0-4b024b5d7cfb\" (UID: \"3a5447ce-6139-4ace-92f0-4b024b5d7cfb\") "
Mar 13 12:38:04 crc kubenswrapper[4786]: I0313 12:38:04.497651 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5447ce-6139-4ace-92f0-4b024b5d7cfb-kube-api-access-5l9jr" (OuterVolumeSpecName: "kube-api-access-5l9jr") pod "3a5447ce-6139-4ace-92f0-4b024b5d7cfb" (UID: "3a5447ce-6139-4ace-92f0-4b024b5d7cfb"). InnerVolumeSpecName "kube-api-access-5l9jr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:38:04 crc kubenswrapper[4786]: I0313 12:38:04.595260 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l9jr\" (UniqueName: \"kubernetes.io/projected/3a5447ce-6139-4ace-92f0-4b024b5d7cfb-kube-api-access-5l9jr\") on node \"crc\" DevicePath \"\""
Mar 13 12:38:05 crc kubenswrapper[4786]: I0313 12:38:05.022461 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-xw6z4" event={"ID":"3a5447ce-6139-4ace-92f0-4b024b5d7cfb","Type":"ContainerDied","Data":"6571f69881565cfd933b4e8e25eb45fce1e7d57cccb4d165568bd139ce9ad1a2"}
Mar 13 12:38:05 crc kubenswrapper[4786]: I0313 12:38:05.022499 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6571f69881565cfd933b4e8e25eb45fce1e7d57cccb4d165568bd139ce9ad1a2"
Mar 13 12:38:05 crc kubenswrapper[4786]: I0313 12:38:05.022534 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-xw6z4"
Mar 13 12:38:05 crc kubenswrapper[4786]: I0313 12:38:05.358958 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-fvpbq"]
Mar 13 12:38:05 crc kubenswrapper[4786]: I0313 12:38:05.367250 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-fvpbq"]
Mar 13 12:38:05 crc kubenswrapper[4786]: I0313 12:38:05.448574 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5aa643d-cff3-4059-aaaa-fe62250d0601" path="/var/lib/kubelet/pods/b5aa643d-cff3-4059-aaaa-fe62250d0601/volumes"
Mar 13 12:38:08 crc kubenswrapper[4786]: I0313 12:38:08.169717 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:38:08 crc kubenswrapper[4786]: I0313 12:38:08.170061 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:38:17 crc kubenswrapper[4786]: I0313 12:38:17.650021 4786 scope.go:117] "RemoveContainer" containerID="d0944c2d6de9ad3c6e5a72672f44a49321a24626918d7b367a9edc3af73b49a5"
Mar 13 12:38:38 crc kubenswrapper[4786]: I0313 12:38:38.169735 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:38:38 crc kubenswrapper[4786]: I0313 12:38:38.171020 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:38:38 crc kubenswrapper[4786]: I0313 12:38:38.171145 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8"
Mar 13 12:38:38 crc kubenswrapper[4786]: I0313 12:38:38.172447 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 12:38:38 crc kubenswrapper[4786]: I0313 12:38:38.172536 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" gracePeriod=600
Mar 13 12:38:38 crc kubenswrapper[4786]: E0313 12:38:38.294005 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:38:39 crc kubenswrapper[4786]: I0313 12:38:39.270307 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" exitCode=0
Mar 13 12:38:39 crc kubenswrapper[4786]: I0313 12:38:39.270380 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"}
Mar 13 12:38:39 crc kubenswrapper[4786]: I0313 12:38:39.270443 4786 scope.go:117] "RemoveContainer" containerID="6cfddb721df23cfcce85ce6f479e1e0451a74c6a861ffb572b373f22dcfb3ad1"
Mar 13 12:38:39 crc kubenswrapper[4786]: I0313 12:38:39.271276 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"
Mar 13 12:38:39 crc kubenswrapper[4786]: E0313 12:38:39.271799 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.331058 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wq4b9"]
Mar 13 12:38:47 crc kubenswrapper[4786]: E0313 12:38:47.332408 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5447ce-6139-4ace-92f0-4b024b5d7cfb" containerName="oc"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.332440 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5447ce-6139-4ace-92f0-4b024b5d7cfb" containerName="oc"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.336330 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5447ce-6139-4ace-92f0-4b024b5d7cfb" containerName="oc"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.338495 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.345175 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq4b9"]
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.486274 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc6ws\" (UniqueName: \"kubernetes.io/projected/15e9e0e5-7b66-4d89-86b6-08ac757db27c-kube-api-access-bc6ws\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.486682 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-catalog-content\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.486836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-utilities\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.588528 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-catalog-content\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.588600 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-utilities\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.588728 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc6ws\" (UniqueName: \"kubernetes.io/projected/15e9e0e5-7b66-4d89-86b6-08ac757db27c-kube-api-access-bc6ws\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.590254 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-catalog-content\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.590829 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-utilities\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.609565 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc6ws\" (UniqueName: \"kubernetes.io/projected/15e9e0e5-7b66-4d89-86b6-08ac757db27c-kube-api-access-bc6ws\") pod \"certified-operators-wq4b9\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") " pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:47 crc kubenswrapper[4786]: I0313 12:38:47.693439 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:48 crc kubenswrapper[4786]: I0313 12:38:48.145080 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq4b9"]
Mar 13 12:38:48 crc kubenswrapper[4786]: I0313 12:38:48.337770 4786 generic.go:334] "Generic (PLEG): container finished" podID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerID="bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880" exitCode=0
Mar 13 12:38:48 crc kubenswrapper[4786]: I0313 12:38:48.337815 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq4b9" event={"ID":"15e9e0e5-7b66-4d89-86b6-08ac757db27c","Type":"ContainerDied","Data":"bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880"}
Mar 13 12:38:48 crc kubenswrapper[4786]: I0313 12:38:48.337840 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq4b9" event={"ID":"15e9e0e5-7b66-4d89-86b6-08ac757db27c","Type":"ContainerStarted","Data":"488ab104f30bba01b89b1333572bbc3b5e7e54746babf49e1f6ca94997948390"}
Mar 13 12:38:50 crc kubenswrapper[4786]: I0313 12:38:50.352503 4786 generic.go:334] "Generic (PLEG): container finished" podID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerID="7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460" exitCode=0
Mar 13 12:38:50 crc kubenswrapper[4786]: I0313 12:38:50.352581 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq4b9" event={"ID":"15e9e0e5-7b66-4d89-86b6-08ac757db27c","Type":"ContainerDied","Data":"7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460"}
Mar 13 12:38:51 crc kubenswrapper[4786]: I0313 12:38:51.361699 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq4b9" event={"ID":"15e9e0e5-7b66-4d89-86b6-08ac757db27c","Type":"ContainerStarted","Data":"708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0"}
Mar 13 12:38:51 crc kubenswrapper[4786]: I0313 12:38:51.384970 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wq4b9" podStartSLOduration=1.733207683 podStartE2EDuration="4.3849446s" podCreationTimestamp="2026-03-13 12:38:47 +0000 UTC" firstStartedPulling="2026-03-13 12:38:48.33933764 +0000 UTC m=+3115.618991087" lastFinishedPulling="2026-03-13 12:38:50.991074517 +0000 UTC m=+3118.270728004" observedRunningTime="2026-03-13 12:38:51.378734219 +0000 UTC m=+3118.658387676" watchObservedRunningTime="2026-03-13 12:38:51.3849446 +0000 UTC m=+3118.664598047"
Mar 13 12:38:53 crc kubenswrapper[4786]: I0313 12:38:53.444707 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"
Mar 13 12:38:53 crc kubenswrapper[4786]: E0313 12:38:53.444895 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:38:57 crc kubenswrapper[4786]: I0313 12:38:57.694230 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:57 crc kubenswrapper[4786]: I0313 12:38:57.694905 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:57 crc kubenswrapper[4786]: I0313 12:38:57.763938 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:58 crc kubenswrapper[4786]: I0313 12:38:58.478668 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:38:58 crc kubenswrapper[4786]: I0313 12:38:58.530367 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wq4b9"]
Mar 13 12:39:00 crc kubenswrapper[4786]: I0313 12:39:00.435291 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wq4b9" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerName="registry-server" containerID="cri-o://708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0" gracePeriod=2
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.415048 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.447772 4786 generic.go:334] "Generic (PLEG): container finished" podID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerID="708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0" exitCode=0
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.447902 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq4b9"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.458385 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq4b9" event={"ID":"15e9e0e5-7b66-4d89-86b6-08ac757db27c","Type":"ContainerDied","Data":"708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0"}
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.458429 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq4b9" event={"ID":"15e9e0e5-7b66-4d89-86b6-08ac757db27c","Type":"ContainerDied","Data":"488ab104f30bba01b89b1333572bbc3b5e7e54746babf49e1f6ca94997948390"}
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.458474 4786 scope.go:117] "RemoveContainer" containerID="708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.480761 4786 scope.go:117] "RemoveContainer" containerID="7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.499927 4786 scope.go:117] "RemoveContainer" containerID="bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.520670 4786 scope.go:117] "RemoveContainer" containerID="708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0"
Mar 13 12:39:01 crc kubenswrapper[4786]: E0313 12:39:01.521202 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0\": container with ID starting with 708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0 not found: ID does not exist" containerID="708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.521234 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0"} err="failed to get container status \"708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0\": rpc error: code = NotFound desc = could not find container \"708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0\": container with ID starting with 708c4f7e733906e9a857730f187e5c289cd1a93b77dc14c1b8f7af74d6311cc0 not found: ID does not exist"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.521256 4786 scope.go:117] "RemoveContainer" containerID="7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460"
Mar 13 12:39:01 crc kubenswrapper[4786]: E0313 12:39:01.521572 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460\": container with ID starting with 7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460 not found: ID does not exist" containerID="7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.521594 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460"} err="failed to get container status \"7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460\": rpc error: code = NotFound desc = could not find container \"7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460\": container with ID starting with 7ab9da801f96e24df48a9e9022094718a6d5bd12dc37ebef23fbe962827ca460 not found: ID does not exist"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.521607 4786 scope.go:117] "RemoveContainer" containerID="bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880"
Mar 13 12:39:01 crc kubenswrapper[4786]: E0313 12:39:01.521819 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880\": container with ID starting with bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880 not found: ID does not exist" containerID="bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.521843 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880"} err="failed to get container status \"bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880\": rpc error: code = NotFound desc = could not find container \"bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880\": container with ID starting with bd9bbbcf33484b7283ae6dc4afce458bf02c7a0b6f58fbedf0dacede4f51e880 not found: ID does not exist"
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.597219 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-catalog-content\") pod \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") "
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.597282 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-utilities\") pod \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") "
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.597334 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc6ws\" (UniqueName: \"kubernetes.io/projected/15e9e0e5-7b66-4d89-86b6-08ac757db27c-kube-api-access-bc6ws\") pod \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\" (UID: \"15e9e0e5-7b66-4d89-86b6-08ac757db27c\") "
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.598435 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-utilities" (OuterVolumeSpecName: "utilities") pod "15e9e0e5-7b66-4d89-86b6-08ac757db27c" (UID: "15e9e0e5-7b66-4d89-86b6-08ac757db27c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.603611 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e9e0e5-7b66-4d89-86b6-08ac757db27c-kube-api-access-bc6ws" (OuterVolumeSpecName: "kube-api-access-bc6ws") pod "15e9e0e5-7b66-4d89-86b6-08ac757db27c" (UID: "15e9e0e5-7b66-4d89-86b6-08ac757db27c"). InnerVolumeSpecName "kube-api-access-bc6ws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.658650 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15e9e0e5-7b66-4d89-86b6-08ac757db27c" (UID: "15e9e0e5-7b66-4d89-86b6-08ac757db27c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.698672 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc6ws\" (UniqueName: \"kubernetes.io/projected/15e9e0e5-7b66-4d89-86b6-08ac757db27c-kube-api-access-bc6ws\") on node \"crc\" DevicePath \"\""
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.698703 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.698712 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e9e0e5-7b66-4d89-86b6-08ac757db27c-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.789321 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wq4b9"]
Mar 13 12:39:01 crc kubenswrapper[4786]: I0313 12:39:01.789365 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wq4b9"]
Mar 13 12:39:03 crc kubenswrapper[4786]: I0313 12:39:03.448259 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" path="/var/lib/kubelet/pods/15e9e0e5-7b66-4d89-86b6-08ac757db27c/volumes"
Mar 13 12:39:04 crc kubenswrapper[4786]: I0313 12:39:04.441130 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"
Mar 13 12:39:04 crc kubenswrapper[4786]: E0313 12:39:04.441442 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:39:15 crc kubenswrapper[4786]: I0313 12:39:15.440796 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"
Mar 13 12:39:15 crc kubenswrapper[4786]: E0313 12:39:15.441669 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:39:29 crc kubenswrapper[4786]: I0313 12:39:29.441403 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"
Mar 13 12:39:29 crc kubenswrapper[4786]: E0313 12:39:29.444069 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:39:43 crc kubenswrapper[4786]: I0313 12:39:43.444779 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"
Mar 13 12:39:43 crc kubenswrapper[4786]: E0313 12:39:43.445552 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:39:58 crc kubenswrapper[4786]: I0313 12:39:58.441001 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"
Mar 13 12:39:58 crc kubenswrapper[4786]: E0313 12:39:58.441846 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.151316 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556760-p7qz5"]
Mar 13 12:40:00 crc kubenswrapper[4786]: E0313 12:40:00.153686 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerName="registry-server"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.153715 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerName="registry-server"
Mar 13 12:40:00 crc kubenswrapper[4786]: E0313 12:40:00.153731 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerName="extract-utilities"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.153740 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerName="extract-utilities"
Mar 13 12:40:00 crc kubenswrapper[4786]: E0313 12:40:00.153760 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerName="extract-content"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.153767 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerName="extract-content"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.153985 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e9e0e5-7b66-4d89-86b6-08ac757db27c" containerName="registry-server"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.154607 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-p7qz5"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.158375 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.158596 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.158834 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.161824 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-p7qz5"]
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.252570 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw85g\" (UniqueName: \"kubernetes.io/projected/41dadc3c-5d43-42e1-8c2d-856105843c8d-kube-api-access-cw85g\") pod \"auto-csr-approver-29556760-p7qz5\" (UID: \"41dadc3c-5d43-42e1-8c2d-856105843c8d\") " pod="openshift-infra/auto-csr-approver-29556760-p7qz5"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.354726 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw85g\" (UniqueName: \"kubernetes.io/projected/41dadc3c-5d43-42e1-8c2d-856105843c8d-kube-api-access-cw85g\") pod \"auto-csr-approver-29556760-p7qz5\" (UID: \"41dadc3c-5d43-42e1-8c2d-856105843c8d\") " pod="openshift-infra/auto-csr-approver-29556760-p7qz5"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.375149 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw85g\" (UniqueName: \"kubernetes.io/projected/41dadc3c-5d43-42e1-8c2d-856105843c8d-kube-api-access-cw85g\") pod \"auto-csr-approver-29556760-p7qz5\" (UID: \"41dadc3c-5d43-42e1-8c2d-856105843c8d\") " pod="openshift-infra/auto-csr-approver-29556760-p7qz5"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.481350 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-p7qz5"
Mar 13 12:40:00 crc kubenswrapper[4786]: I0313 12:40:00.922784 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-p7qz5"]
Mar 13 12:40:01 crc kubenswrapper[4786]: I0313 12:40:01.921515 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-p7qz5" event={"ID":"41dadc3c-5d43-42e1-8c2d-856105843c8d","Type":"ContainerStarted","Data":"fb5594401f025c22aa4117256a89aa4f50426d85448e54d1a563a64be07edc85"}
Mar 13 12:40:02 crc kubenswrapper[4786]: I0313 12:40:02.930620 4786 generic.go:334] "Generic (PLEG): container finished" podID="41dadc3c-5d43-42e1-8c2d-856105843c8d" containerID="ec8d1addf9e2ffde656a9745a7f5202c12e0c9691d404a33e9e80f4861145dc0" exitCode=0
Mar 13 12:40:02 crc kubenswrapper[4786]: I0313 12:40:02.930676 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-p7qz5" event={"ID":"41dadc3c-5d43-42e1-8c2d-856105843c8d","Type":"ContainerDied","Data":"ec8d1addf9e2ffde656a9745a7f5202c12e0c9691d404a33e9e80f4861145dc0"}
Mar 13 12:40:04 crc kubenswrapper[4786]: I0313
12:40:04.209226 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-p7qz5" Mar 13 12:40:04 crc kubenswrapper[4786]: I0313 12:40:04.323522 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw85g\" (UniqueName: \"kubernetes.io/projected/41dadc3c-5d43-42e1-8c2d-856105843c8d-kube-api-access-cw85g\") pod \"41dadc3c-5d43-42e1-8c2d-856105843c8d\" (UID: \"41dadc3c-5d43-42e1-8c2d-856105843c8d\") " Mar 13 12:40:04 crc kubenswrapper[4786]: I0313 12:40:04.328935 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41dadc3c-5d43-42e1-8c2d-856105843c8d-kube-api-access-cw85g" (OuterVolumeSpecName: "kube-api-access-cw85g") pod "41dadc3c-5d43-42e1-8c2d-856105843c8d" (UID: "41dadc3c-5d43-42e1-8c2d-856105843c8d"). InnerVolumeSpecName "kube-api-access-cw85g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:40:04 crc kubenswrapper[4786]: I0313 12:40:04.425125 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw85g\" (UniqueName: \"kubernetes.io/projected/41dadc3c-5d43-42e1-8c2d-856105843c8d-kube-api-access-cw85g\") on node \"crc\" DevicePath \"\"" Mar 13 12:40:04 crc kubenswrapper[4786]: I0313 12:40:04.945927 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-p7qz5" event={"ID":"41dadc3c-5d43-42e1-8c2d-856105843c8d","Type":"ContainerDied","Data":"fb5594401f025c22aa4117256a89aa4f50426d85448e54d1a563a64be07edc85"} Mar 13 12:40:04 crc kubenswrapper[4786]: I0313 12:40:04.945989 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb5594401f025c22aa4117256a89aa4f50426d85448e54d1a563a64be07edc85" Mar 13 12:40:04 crc kubenswrapper[4786]: I0313 12:40:04.946065 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-p7qz5" Mar 13 12:40:05 crc kubenswrapper[4786]: I0313 12:40:05.289908 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-qwk5s"] Mar 13 12:40:05 crc kubenswrapper[4786]: I0313 12:40:05.295782 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-qwk5s"] Mar 13 12:40:05 crc kubenswrapper[4786]: I0313 12:40:05.449783 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5fce5a6-e739-4860-9e11-c874a6f2d232" path="/var/lib/kubelet/pods/c5fce5a6-e739-4860-9e11-c874a6f2d232/volumes" Mar 13 12:40:13 crc kubenswrapper[4786]: I0313 12:40:13.443839 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:40:13 crc kubenswrapper[4786]: E0313 12:40:13.446188 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:40:17 crc kubenswrapper[4786]: I0313 12:40:17.763046 4786 scope.go:117] "RemoveContainer" containerID="56051ec14e93ef5fa22ac947c2f78edaace8b4d527da3805e2f6103cf218d7e1" Mar 13 12:40:25 crc kubenswrapper[4786]: I0313 12:40:25.440441 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:40:25 crc kubenswrapper[4786]: E0313 12:40:25.441281 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:40:37 crc kubenswrapper[4786]: I0313 12:40:37.441211 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:40:37 crc kubenswrapper[4786]: E0313 12:40:37.442014 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:40:51 crc kubenswrapper[4786]: I0313 12:40:51.440839 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:40:51 crc kubenswrapper[4786]: E0313 12:40:51.441553 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:41:03 crc kubenswrapper[4786]: I0313 12:41:03.445785 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:41:03 crc kubenswrapper[4786]: E0313 12:41:03.447146 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:41:17 crc kubenswrapper[4786]: I0313 12:41:17.440985 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:41:17 crc kubenswrapper[4786]: E0313 12:41:17.441658 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:41:32 crc kubenswrapper[4786]: I0313 12:41:32.440440 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:41:32 crc kubenswrapper[4786]: E0313 12:41:32.441198 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:41:47 crc kubenswrapper[4786]: I0313 12:41:47.441667 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:41:47 crc kubenswrapper[4786]: E0313 12:41:47.442858 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:41:59 crc kubenswrapper[4786]: I0313 12:41:59.440755 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:41:59 crc kubenswrapper[4786]: E0313 12:41:59.441487 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.209545 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556762-xb6sx"] Mar 13 12:42:00 crc kubenswrapper[4786]: E0313 12:42:00.209854 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41dadc3c-5d43-42e1-8c2d-856105843c8d" containerName="oc" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.209869 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="41dadc3c-5d43-42e1-8c2d-856105843c8d" containerName="oc" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.210041 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="41dadc3c-5d43-42e1-8c2d-856105843c8d" containerName="oc" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.210484 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-xb6sx" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.213359 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.213684 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.224776 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.230339 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-xb6sx"] Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.325704 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qcwq\" (UniqueName: \"kubernetes.io/projected/3c97978b-34cc-42ae-89b4-23165dcb1d53-kube-api-access-6qcwq\") pod \"auto-csr-approver-29556762-xb6sx\" (UID: \"3c97978b-34cc-42ae-89b4-23165dcb1d53\") " pod="openshift-infra/auto-csr-approver-29556762-xb6sx" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.426780 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qcwq\" (UniqueName: \"kubernetes.io/projected/3c97978b-34cc-42ae-89b4-23165dcb1d53-kube-api-access-6qcwq\") pod \"auto-csr-approver-29556762-xb6sx\" (UID: \"3c97978b-34cc-42ae-89b4-23165dcb1d53\") " pod="openshift-infra/auto-csr-approver-29556762-xb6sx" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.448495 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qcwq\" (UniqueName: \"kubernetes.io/projected/3c97978b-34cc-42ae-89b4-23165dcb1d53-kube-api-access-6qcwq\") pod \"auto-csr-approver-29556762-xb6sx\" (UID: \"3c97978b-34cc-42ae-89b4-23165dcb1d53\") " 
pod="openshift-infra/auto-csr-approver-29556762-xb6sx" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.526458 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-xb6sx" Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.960369 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-xb6sx"] Mar 13 12:42:00 crc kubenswrapper[4786]: I0313 12:42:00.966851 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:42:01 crc kubenswrapper[4786]: I0313 12:42:01.782790 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-xb6sx" event={"ID":"3c97978b-34cc-42ae-89b4-23165dcb1d53","Type":"ContainerStarted","Data":"55b5c0eadcacd470ffcd5fc061bb7bc14640a438d9cad7b882f0f619d47e8fd2"} Mar 13 12:42:02 crc kubenswrapper[4786]: I0313 12:42:02.794436 4786 generic.go:334] "Generic (PLEG): container finished" podID="3c97978b-34cc-42ae-89b4-23165dcb1d53" containerID="e13b6ee3b790ae3912adab646429895b82e943e7c8b6b9b4b81cdc38f494845c" exitCode=0 Mar 13 12:42:02 crc kubenswrapper[4786]: I0313 12:42:02.794543 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-xb6sx" event={"ID":"3c97978b-34cc-42ae-89b4-23165dcb1d53","Type":"ContainerDied","Data":"e13b6ee3b790ae3912adab646429895b82e943e7c8b6b9b4b81cdc38f494845c"} Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.053948 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-xb6sx" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.105268 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5hfpf"] Mar 13 12:42:04 crc kubenswrapper[4786]: E0313 12:42:04.105584 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c97978b-34cc-42ae-89b4-23165dcb1d53" containerName="oc" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.105603 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c97978b-34cc-42ae-89b4-23165dcb1d53" containerName="oc" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.105744 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c97978b-34cc-42ae-89b4-23165dcb1d53" containerName="oc" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.109320 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.127224 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5hfpf"] Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.181551 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qcwq\" (UniqueName: \"kubernetes.io/projected/3c97978b-34cc-42ae-89b4-23165dcb1d53-kube-api-access-6qcwq\") pod \"3c97978b-34cc-42ae-89b4-23165dcb1d53\" (UID: \"3c97978b-34cc-42ae-89b4-23165dcb1d53\") " Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.182006 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-catalog-content\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc 
kubenswrapper[4786]: I0313 12:42:04.182044 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-utilities\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.182084 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhhv\" (UniqueName: \"kubernetes.io/projected/d8cee64a-8278-4642-a224-27378f98fa7d-kube-api-access-bnhhv\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.195669 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c97978b-34cc-42ae-89b4-23165dcb1d53-kube-api-access-6qcwq" (OuterVolumeSpecName: "kube-api-access-6qcwq") pod "3c97978b-34cc-42ae-89b4-23165dcb1d53" (UID: "3c97978b-34cc-42ae-89b4-23165dcb1d53"). InnerVolumeSpecName "kube-api-access-6qcwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.283857 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-catalog-content\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.283936 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-utilities\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.283989 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnhhv\" (UniqueName: \"kubernetes.io/projected/d8cee64a-8278-4642-a224-27378f98fa7d-kube-api-access-bnhhv\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.284095 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qcwq\" (UniqueName: \"kubernetes.io/projected/3c97978b-34cc-42ae-89b4-23165dcb1d53-kube-api-access-6qcwq\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.284435 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-catalog-content\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.284864 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-utilities\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.303213 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnhhv\" (UniqueName: \"kubernetes.io/projected/d8cee64a-8278-4642-a224-27378f98fa7d-kube-api-access-bnhhv\") pod \"community-operators-5hfpf\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.427652 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.823546 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-xb6sx" event={"ID":"3c97978b-34cc-42ae-89b4-23165dcb1d53","Type":"ContainerDied","Data":"55b5c0eadcacd470ffcd5fc061bb7bc14640a438d9cad7b882f0f619d47e8fd2"} Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.823985 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55b5c0eadcacd470ffcd5fc061bb7bc14640a438d9cad7b882f0f619d47e8fd2" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.824016 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-xb6sx" Mar 13 12:42:04 crc kubenswrapper[4786]: I0313 12:42:04.974588 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5hfpf"] Mar 13 12:42:04 crc kubenswrapper[4786]: W0313 12:42:04.981358 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8cee64a_8278_4642_a224_27378f98fa7d.slice/crio-ba87a46e267b0209b97ab40135084c22fb6c4b90628f2b0d0ec39a1f94d81a5d WatchSource:0}: Error finding container ba87a46e267b0209b97ab40135084c22fb6c4b90628f2b0d0ec39a1f94d81a5d: Status 404 returned error can't find the container with id ba87a46e267b0209b97ab40135084c22fb6c4b90628f2b0d0ec39a1f94d81a5d Mar 13 12:42:05 crc kubenswrapper[4786]: I0313 12:42:05.130638 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-rc9m7"] Mar 13 12:42:05 crc kubenswrapper[4786]: I0313 12:42:05.136566 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-rc9m7"] Mar 13 12:42:05 crc kubenswrapper[4786]: I0313 12:42:05.451948 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740920fa-04bf-476f-88e9-b8d4abe078df" path="/var/lib/kubelet/pods/740920fa-04bf-476f-88e9-b8d4abe078df/volumes" Mar 13 12:42:05 crc kubenswrapper[4786]: I0313 12:42:05.831832 4786 generic.go:334] "Generic (PLEG): container finished" podID="d8cee64a-8278-4642-a224-27378f98fa7d" containerID="02a71efbd5a57d2888d7b5bcf5a2794166ea8b0bb5fa374bc24294c02be3697d" exitCode=0 Mar 13 12:42:05 crc kubenswrapper[4786]: I0313 12:42:05.831901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hfpf" event={"ID":"d8cee64a-8278-4642-a224-27378f98fa7d","Type":"ContainerDied","Data":"02a71efbd5a57d2888d7b5bcf5a2794166ea8b0bb5fa374bc24294c02be3697d"} Mar 13 12:42:05 crc 
kubenswrapper[4786]: I0313 12:42:05.831933 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hfpf" event={"ID":"d8cee64a-8278-4642-a224-27378f98fa7d","Type":"ContainerStarted","Data":"ba87a46e267b0209b97ab40135084c22fb6c4b90628f2b0d0ec39a1f94d81a5d"} Mar 13 12:42:06 crc kubenswrapper[4786]: I0313 12:42:06.841641 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hfpf" event={"ID":"d8cee64a-8278-4642-a224-27378f98fa7d","Type":"ContainerStarted","Data":"3beedbcbff5b979d44a459844b6cf87f518b99d8666e6441599b95cd9170290a"} Mar 13 12:42:07 crc kubenswrapper[4786]: I0313 12:42:07.849389 4786 generic.go:334] "Generic (PLEG): container finished" podID="d8cee64a-8278-4642-a224-27378f98fa7d" containerID="3beedbcbff5b979d44a459844b6cf87f518b99d8666e6441599b95cd9170290a" exitCode=0 Mar 13 12:42:07 crc kubenswrapper[4786]: I0313 12:42:07.849497 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hfpf" event={"ID":"d8cee64a-8278-4642-a224-27378f98fa7d","Type":"ContainerDied","Data":"3beedbcbff5b979d44a459844b6cf87f518b99d8666e6441599b95cd9170290a"} Mar 13 12:42:08 crc kubenswrapper[4786]: I0313 12:42:08.858852 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hfpf" event={"ID":"d8cee64a-8278-4642-a224-27378f98fa7d","Type":"ContainerStarted","Data":"d33d3f369cc3c7a492719829b47eb776a6f2e710e63b75a59f2b3ca670e5d667"} Mar 13 12:42:08 crc kubenswrapper[4786]: I0313 12:42:08.883492 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5hfpf" podStartSLOduration=2.373463903 podStartE2EDuration="4.883471472s" podCreationTimestamp="2026-03-13 12:42:04 +0000 UTC" firstStartedPulling="2026-03-13 12:42:05.833973007 +0000 UTC m=+3313.113626454" lastFinishedPulling="2026-03-13 12:42:08.343980586 +0000 UTC m=+3315.623634023" 
observedRunningTime="2026-03-13 12:42:08.878183108 +0000 UTC m=+3316.157836565" watchObservedRunningTime="2026-03-13 12:42:08.883471472 +0000 UTC m=+3316.163124919" Mar 13 12:42:10 crc kubenswrapper[4786]: I0313 12:42:10.440594 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:42:10 crc kubenswrapper[4786]: E0313 12:42:10.440834 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:42:14 crc kubenswrapper[4786]: I0313 12:42:14.429081 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:14 crc kubenswrapper[4786]: I0313 12:42:14.430285 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:14 crc kubenswrapper[4786]: I0313 12:42:14.477375 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:14 crc kubenswrapper[4786]: I0313 12:42:14.944089 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:14 crc kubenswrapper[4786]: I0313 12:42:14.990262 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5hfpf"] Mar 13 12:42:16 crc kubenswrapper[4786]: I0313 12:42:16.913690 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5hfpf" 
podUID="d8cee64a-8278-4642-a224-27378f98fa7d" containerName="registry-server" containerID="cri-o://d33d3f369cc3c7a492719829b47eb776a6f2e710e63b75a59f2b3ca670e5d667" gracePeriod=2 Mar 13 12:42:17 crc kubenswrapper[4786]: I0313 12:42:17.853314 4786 scope.go:117] "RemoveContainer" containerID="7f0421a07ad07fead3a8af38068edfe7dc9e98a4544eb492a87346f25c483036" Mar 13 12:42:17 crc kubenswrapper[4786]: I0313 12:42:17.923198 4786 generic.go:334] "Generic (PLEG): container finished" podID="d8cee64a-8278-4642-a224-27378f98fa7d" containerID="d33d3f369cc3c7a492719829b47eb776a6f2e710e63b75a59f2b3ca670e5d667" exitCode=0 Mar 13 12:42:17 crc kubenswrapper[4786]: I0313 12:42:17.923279 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hfpf" event={"ID":"d8cee64a-8278-4642-a224-27378f98fa7d","Type":"ContainerDied","Data":"d33d3f369cc3c7a492719829b47eb776a6f2e710e63b75a59f2b3ca670e5d667"} Mar 13 12:42:17 crc kubenswrapper[4786]: I0313 12:42:17.997308 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.083546 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-catalog-content\") pod \"d8cee64a-8278-4642-a224-27378f98fa7d\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.083627 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-utilities\") pod \"d8cee64a-8278-4642-a224-27378f98fa7d\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.083660 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnhhv\" (UniqueName: \"kubernetes.io/projected/d8cee64a-8278-4642-a224-27378f98fa7d-kube-api-access-bnhhv\") pod \"d8cee64a-8278-4642-a224-27378f98fa7d\" (UID: \"d8cee64a-8278-4642-a224-27378f98fa7d\") " Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.084908 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-utilities" (OuterVolumeSpecName: "utilities") pod "d8cee64a-8278-4642-a224-27378f98fa7d" (UID: "d8cee64a-8278-4642-a224-27378f98fa7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.091781 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8cee64a-8278-4642-a224-27378f98fa7d-kube-api-access-bnhhv" (OuterVolumeSpecName: "kube-api-access-bnhhv") pod "d8cee64a-8278-4642-a224-27378f98fa7d" (UID: "d8cee64a-8278-4642-a224-27378f98fa7d"). InnerVolumeSpecName "kube-api-access-bnhhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.148397 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8cee64a-8278-4642-a224-27378f98fa7d" (UID: "d8cee64a-8278-4642-a224-27378f98fa7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.185564 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.185598 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8cee64a-8278-4642-a224-27378f98fa7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.185608 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnhhv\" (UniqueName: \"kubernetes.io/projected/d8cee64a-8278-4642-a224-27378f98fa7d-kube-api-access-bnhhv\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.937056 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hfpf" event={"ID":"d8cee64a-8278-4642-a224-27378f98fa7d","Type":"ContainerDied","Data":"ba87a46e267b0209b97ab40135084c22fb6c4b90628f2b0d0ec39a1f94d81a5d"} Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.937395 4786 scope.go:117] "RemoveContainer" containerID="d33d3f369cc3c7a492719829b47eb776a6f2e710e63b75a59f2b3ca670e5d667" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.937146 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hfpf" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.967930 4786 scope.go:117] "RemoveContainer" containerID="3beedbcbff5b979d44a459844b6cf87f518b99d8666e6441599b95cd9170290a" Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.986099 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5hfpf"] Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.992618 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5hfpf"] Mar 13 12:42:18 crc kubenswrapper[4786]: I0313 12:42:18.995360 4786 scope.go:117] "RemoveContainer" containerID="02a71efbd5a57d2888d7b5bcf5a2794166ea8b0bb5fa374bc24294c02be3697d" Mar 13 12:42:19 crc kubenswrapper[4786]: I0313 12:42:19.451250 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8cee64a-8278-4642-a224-27378f98fa7d" path="/var/lib/kubelet/pods/d8cee64a-8278-4642-a224-27378f98fa7d/volumes" Mar 13 12:42:21 crc kubenswrapper[4786]: I0313 12:42:21.441533 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:42:21 crc kubenswrapper[4786]: E0313 12:42:21.442376 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:42:32 crc kubenswrapper[4786]: I0313 12:42:32.440987 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:42:32 crc kubenswrapper[4786]: E0313 12:42:32.441698 4786 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:42:44 crc kubenswrapper[4786]: I0313 12:42:44.441000 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:42:44 crc kubenswrapper[4786]: E0313 12:42:44.441791 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.441057 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:42:58 crc kubenswrapper[4786]: E0313 12:42:58.441713 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.821360 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dxvgx"] Mar 13 12:42:58 crc kubenswrapper[4786]: E0313 12:42:58.821937 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d8cee64a-8278-4642-a224-27378f98fa7d" containerName="extract-utilities" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.822027 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cee64a-8278-4642-a224-27378f98fa7d" containerName="extract-utilities" Mar 13 12:42:58 crc kubenswrapper[4786]: E0313 12:42:58.822094 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8cee64a-8278-4642-a224-27378f98fa7d" containerName="extract-content" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.822164 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cee64a-8278-4642-a224-27378f98fa7d" containerName="extract-content" Mar 13 12:42:58 crc kubenswrapper[4786]: E0313 12:42:58.822240 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8cee64a-8278-4642-a224-27378f98fa7d" containerName="registry-server" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.822296 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cee64a-8278-4642-a224-27378f98fa7d" containerName="registry-server" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.822492 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8cee64a-8278-4642-a224-27378f98fa7d" containerName="registry-server" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.823491 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.841970 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dxvgx"] Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.964167 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2krpl\" (UniqueName: \"kubernetes.io/projected/3e903ee4-071c-4016-8f76-78ad4ec3623c-kube-api-access-2krpl\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.964693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-catalog-content\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:58 crc kubenswrapper[4786]: I0313 12:42:58.964747 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-utilities\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:59 crc kubenswrapper[4786]: I0313 12:42:59.066134 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2krpl\" (UniqueName: \"kubernetes.io/projected/3e903ee4-071c-4016-8f76-78ad4ec3623c-kube-api-access-2krpl\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:59 crc kubenswrapper[4786]: I0313 12:42:59.066491 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-catalog-content\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:59 crc kubenswrapper[4786]: I0313 12:42:59.066609 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-utilities\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:59 crc kubenswrapper[4786]: I0313 12:42:59.067040 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-catalog-content\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:59 crc kubenswrapper[4786]: I0313 12:42:59.067138 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-utilities\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:59 crc kubenswrapper[4786]: I0313 12:42:59.087266 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2krpl\" (UniqueName: \"kubernetes.io/projected/3e903ee4-071c-4016-8f76-78ad4ec3623c-kube-api-access-2krpl\") pod \"redhat-operators-dxvgx\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:59 crc kubenswrapper[4786]: I0313 12:42:59.149436 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:42:59 crc kubenswrapper[4786]: I0313 12:42:59.575580 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dxvgx"] Mar 13 12:43:00 crc kubenswrapper[4786]: I0313 12:43:00.229993 4786 generic.go:334] "Generic (PLEG): container finished" podID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerID="d03a08aeeca8039c9f60e7c7b7c417d8fd0d4ee2264f1edb6b28da1f8d21a630" exitCode=0 Mar 13 12:43:00 crc kubenswrapper[4786]: I0313 12:43:00.230290 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxvgx" event={"ID":"3e903ee4-071c-4016-8f76-78ad4ec3623c","Type":"ContainerDied","Data":"d03a08aeeca8039c9f60e7c7b7c417d8fd0d4ee2264f1edb6b28da1f8d21a630"} Mar 13 12:43:00 crc kubenswrapper[4786]: I0313 12:43:00.230316 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxvgx" event={"ID":"3e903ee4-071c-4016-8f76-78ad4ec3623c","Type":"ContainerStarted","Data":"c2440490901ffe09ec8daac6f0a0e3f219dce0e1a4ea2696f0af9d10dd400a46"} Mar 13 12:43:02 crc kubenswrapper[4786]: I0313 12:43:02.249103 4786 generic.go:334] "Generic (PLEG): container finished" podID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerID="973ba72e413aa4700b39d0efb209b38b156bd20922bd8bea5cb9f7e2545d76a9" exitCode=0 Mar 13 12:43:02 crc kubenswrapper[4786]: I0313 12:43:02.249188 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxvgx" event={"ID":"3e903ee4-071c-4016-8f76-78ad4ec3623c","Type":"ContainerDied","Data":"973ba72e413aa4700b39d0efb209b38b156bd20922bd8bea5cb9f7e2545d76a9"} Mar 13 12:43:03 crc kubenswrapper[4786]: I0313 12:43:03.260086 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxvgx" 
event={"ID":"3e903ee4-071c-4016-8f76-78ad4ec3623c","Type":"ContainerStarted","Data":"0d60ec99e19401da6e3649a5b5ae06ac1a9e352180bfa2fda6390b673db1b0a2"} Mar 13 12:43:03 crc kubenswrapper[4786]: I0313 12:43:03.284281 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dxvgx" podStartSLOduration=2.882724746 podStartE2EDuration="5.284258145s" podCreationTimestamp="2026-03-13 12:42:58 +0000 UTC" firstStartedPulling="2026-03-13 12:43:00.231519881 +0000 UTC m=+3367.511173338" lastFinishedPulling="2026-03-13 12:43:02.63305329 +0000 UTC m=+3369.912706737" observedRunningTime="2026-03-13 12:43:03.282752034 +0000 UTC m=+3370.562405491" watchObservedRunningTime="2026-03-13 12:43:03.284258145 +0000 UTC m=+3370.563911592" Mar 13 12:43:09 crc kubenswrapper[4786]: I0313 12:43:09.150279 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:43:09 crc kubenswrapper[4786]: I0313 12:43:09.150694 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:43:09 crc kubenswrapper[4786]: I0313 12:43:09.195595 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:43:09 crc kubenswrapper[4786]: I0313 12:43:09.341951 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:43:12 crc kubenswrapper[4786]: I0313 12:43:12.435636 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dxvgx"] Mar 13 12:43:12 crc kubenswrapper[4786]: I0313 12:43:12.436694 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dxvgx" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerName="registry-server" 
containerID="cri-o://0d60ec99e19401da6e3649a5b5ae06ac1a9e352180bfa2fda6390b673db1b0a2" gracePeriod=2 Mar 13 12:43:13 crc kubenswrapper[4786]: I0313 12:43:13.330341 4786 generic.go:334] "Generic (PLEG): container finished" podID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerID="0d60ec99e19401da6e3649a5b5ae06ac1a9e352180bfa2fda6390b673db1b0a2" exitCode=0 Mar 13 12:43:13 crc kubenswrapper[4786]: I0313 12:43:13.330419 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxvgx" event={"ID":"3e903ee4-071c-4016-8f76-78ad4ec3623c","Type":"ContainerDied","Data":"0d60ec99e19401da6e3649a5b5ae06ac1a9e352180bfa2fda6390b673db1b0a2"} Mar 13 12:43:13 crc kubenswrapper[4786]: I0313 12:43:13.445170 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:43:13 crc kubenswrapper[4786]: E0313 12:43:13.445385 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:43:13 crc kubenswrapper[4786]: I0313 12:43:13.908613 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.071457 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-utilities\") pod \"3e903ee4-071c-4016-8f76-78ad4ec3623c\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.071578 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-catalog-content\") pod \"3e903ee4-071c-4016-8f76-78ad4ec3623c\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.071674 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2krpl\" (UniqueName: \"kubernetes.io/projected/3e903ee4-071c-4016-8f76-78ad4ec3623c-kube-api-access-2krpl\") pod \"3e903ee4-071c-4016-8f76-78ad4ec3623c\" (UID: \"3e903ee4-071c-4016-8f76-78ad4ec3623c\") " Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.072562 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-utilities" (OuterVolumeSpecName: "utilities") pod "3e903ee4-071c-4016-8f76-78ad4ec3623c" (UID: "3e903ee4-071c-4016-8f76-78ad4ec3623c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.086531 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e903ee4-071c-4016-8f76-78ad4ec3623c-kube-api-access-2krpl" (OuterVolumeSpecName: "kube-api-access-2krpl") pod "3e903ee4-071c-4016-8f76-78ad4ec3623c" (UID: "3e903ee4-071c-4016-8f76-78ad4ec3623c"). InnerVolumeSpecName "kube-api-access-2krpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.173841 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.173929 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2krpl\" (UniqueName: \"kubernetes.io/projected/3e903ee4-071c-4016-8f76-78ad4ec3623c-kube-api-access-2krpl\") on node \"crc\" DevicePath \"\"" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.198812 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e903ee4-071c-4016-8f76-78ad4ec3623c" (UID: "3e903ee4-071c-4016-8f76-78ad4ec3623c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.274752 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e903ee4-071c-4016-8f76-78ad4ec3623c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.340158 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxvgx" event={"ID":"3e903ee4-071c-4016-8f76-78ad4ec3623c","Type":"ContainerDied","Data":"c2440490901ffe09ec8daac6f0a0e3f219dce0e1a4ea2696f0af9d10dd400a46"} Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.340228 4786 scope.go:117] "RemoveContainer" containerID="0d60ec99e19401da6e3649a5b5ae06ac1a9e352180bfa2fda6390b673db1b0a2" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.340239 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxvgx" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.369750 4786 scope.go:117] "RemoveContainer" containerID="973ba72e413aa4700b39d0efb209b38b156bd20922bd8bea5cb9f7e2545d76a9" Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.373110 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dxvgx"] Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.380117 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dxvgx"] Mar 13 12:43:14 crc kubenswrapper[4786]: I0313 12:43:14.392027 4786 scope.go:117] "RemoveContainer" containerID="d03a08aeeca8039c9f60e7c7b7c417d8fd0d4ee2264f1edb6b28da1f8d21a630" Mar 13 12:43:15 crc kubenswrapper[4786]: I0313 12:43:15.451363 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" path="/var/lib/kubelet/pods/3e903ee4-071c-4016-8f76-78ad4ec3623c/volumes" Mar 13 12:43:24 crc kubenswrapper[4786]: I0313 12:43:24.440520 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:43:24 crc kubenswrapper[4786]: E0313 12:43:24.441111 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:43:38 crc kubenswrapper[4786]: I0313 12:43:38.440557 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086" Mar 13 12:43:39 crc kubenswrapper[4786]: I0313 12:43:39.510713 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"bb1ee42ab47d25b0b7c8b74d842dae4ca653711715ffa1c99328b89a180c93ba"} Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.156587 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556764-8rbq4"] Mar 13 12:44:00 crc kubenswrapper[4786]: E0313 12:44:00.157490 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerName="registry-server" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.157506 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerName="registry-server" Mar 13 12:44:00 crc kubenswrapper[4786]: E0313 12:44:00.157535 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerName="extract-content" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.157546 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerName="extract-content" Mar 13 12:44:00 crc kubenswrapper[4786]: E0313 12:44:00.157573 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerName="extract-utilities" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.157584 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerName="extract-utilities" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.157803 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e903ee4-071c-4016-8f76-78ad4ec3623c" containerName="registry-server" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.158465 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-8rbq4" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.162006 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.162402 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.162823 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.168722 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-8rbq4"] Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.244345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6x92\" (UniqueName: \"kubernetes.io/projected/d6824633-eedd-469d-8601-9d1d591d1af1-kube-api-access-p6x92\") pod \"auto-csr-approver-29556764-8rbq4\" (UID: \"d6824633-eedd-469d-8601-9d1d591d1af1\") " pod="openshift-infra/auto-csr-approver-29556764-8rbq4" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.345866 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6x92\" (UniqueName: \"kubernetes.io/projected/d6824633-eedd-469d-8601-9d1d591d1af1-kube-api-access-p6x92\") pod \"auto-csr-approver-29556764-8rbq4\" (UID: \"d6824633-eedd-469d-8601-9d1d591d1af1\") " pod="openshift-infra/auto-csr-approver-29556764-8rbq4" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.365027 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6x92\" (UniqueName: \"kubernetes.io/projected/d6824633-eedd-469d-8601-9d1d591d1af1-kube-api-access-p6x92\") pod \"auto-csr-approver-29556764-8rbq4\" (UID: \"d6824633-eedd-469d-8601-9d1d591d1af1\") " 
pod="openshift-infra/auto-csr-approver-29556764-8rbq4" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.488242 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-8rbq4" Mar 13 12:44:00 crc kubenswrapper[4786]: I0313 12:44:00.903495 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-8rbq4"] Mar 13 12:44:01 crc kubenswrapper[4786]: I0313 12:44:01.681189 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-8rbq4" event={"ID":"d6824633-eedd-469d-8601-9d1d591d1af1","Type":"ContainerStarted","Data":"8f51bfec54d46f4ec7fd0afd80cf607a4faf73e465f63a63d8c3cadcd64486e9"} Mar 13 12:44:02 crc kubenswrapper[4786]: I0313 12:44:02.689725 4786 generic.go:334] "Generic (PLEG): container finished" podID="d6824633-eedd-469d-8601-9d1d591d1af1" containerID="23105baea5b7da897e6f96af8407d3481dfb4ac024e19a351a9946a5803be46c" exitCode=0 Mar 13 12:44:02 crc kubenswrapper[4786]: I0313 12:44:02.689777 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-8rbq4" event={"ID":"d6824633-eedd-469d-8601-9d1d591d1af1","Type":"ContainerDied","Data":"23105baea5b7da897e6f96af8407d3481dfb4ac024e19a351a9946a5803be46c"} Mar 13 12:44:03 crc kubenswrapper[4786]: I0313 12:44:03.981940 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-8rbq4" Mar 13 12:44:04 crc kubenswrapper[4786]: I0313 12:44:04.104749 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6x92\" (UniqueName: \"kubernetes.io/projected/d6824633-eedd-469d-8601-9d1d591d1af1-kube-api-access-p6x92\") pod \"d6824633-eedd-469d-8601-9d1d591d1af1\" (UID: \"d6824633-eedd-469d-8601-9d1d591d1af1\") " Mar 13 12:44:04 crc kubenswrapper[4786]: I0313 12:44:04.110127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6824633-eedd-469d-8601-9d1d591d1af1-kube-api-access-p6x92" (OuterVolumeSpecName: "kube-api-access-p6x92") pod "d6824633-eedd-469d-8601-9d1d591d1af1" (UID: "d6824633-eedd-469d-8601-9d1d591d1af1"). InnerVolumeSpecName "kube-api-access-p6x92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:44:04 crc kubenswrapper[4786]: I0313 12:44:04.206308 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6x92\" (UniqueName: \"kubernetes.io/projected/d6824633-eedd-469d-8601-9d1d591d1af1-kube-api-access-p6x92\") on node \"crc\" DevicePath \"\"" Mar 13 12:44:04 crc kubenswrapper[4786]: I0313 12:44:04.705643 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-8rbq4" Mar 13 12:44:04 crc kubenswrapper[4786]: I0313 12:44:04.705614 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-8rbq4" event={"ID":"d6824633-eedd-469d-8601-9d1d591d1af1","Type":"ContainerDied","Data":"8f51bfec54d46f4ec7fd0afd80cf607a4faf73e465f63a63d8c3cadcd64486e9"} Mar 13 12:44:04 crc kubenswrapper[4786]: I0313 12:44:04.705916 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f51bfec54d46f4ec7fd0afd80cf607a4faf73e465f63a63d8c3cadcd64486e9" Mar 13 12:44:05 crc kubenswrapper[4786]: I0313 12:44:05.050533 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-xw6z4"] Mar 13 12:44:05 crc kubenswrapper[4786]: I0313 12:44:05.055518 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-xw6z4"] Mar 13 12:44:05 crc kubenswrapper[4786]: I0313 12:44:05.450056 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a5447ce-6139-4ace-92f0-4b024b5d7cfb" path="/var/lib/kubelet/pods/3a5447ce-6139-4ace-92f0-4b024b5d7cfb/volumes" Mar 13 12:44:17 crc kubenswrapper[4786]: I0313 12:44:17.974407 4786 scope.go:117] "RemoveContainer" containerID="bce9864d9ae4a215cd60d7776f0bdc1de3c8bb11a6177e22fe632afc3a25a035" Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.156514 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"] Mar 13 12:45:00 crc kubenswrapper[4786]: E0313 12:45:00.157407 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6824633-eedd-469d-8601-9d1d591d1af1" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.157426 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6824633-eedd-469d-8601-9d1d591d1af1" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4786]: 
I0313 12:45:00.157589 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6824633-eedd-469d-8601-9d1d591d1af1" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.158178 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm" Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.160743 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.161014 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.169422 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"] Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.232771 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25d27285-1251-4bd5-ad74-9c17e3019ab3-secret-volume\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm" Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.232926 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5f4d\" (UniqueName: \"kubernetes.io/projected/25d27285-1251-4bd5-ad74-9c17e3019ab3-kube-api-access-t5f4d\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm" Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.233046 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25d27285-1251-4bd5-ad74-9c17e3019ab3-config-volume\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.334389 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25d27285-1251-4bd5-ad74-9c17e3019ab3-secret-volume\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.334438 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5f4d\" (UniqueName: \"kubernetes.io/projected/25d27285-1251-4bd5-ad74-9c17e3019ab3-kube-api-access-t5f4d\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.334475 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25d27285-1251-4bd5-ad74-9c17e3019ab3-config-volume\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.335694 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25d27285-1251-4bd5-ad74-9c17e3019ab3-config-volume\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.340569 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25d27285-1251-4bd5-ad74-9c17e3019ab3-secret-volume\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.359459 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5f4d\" (UniqueName: \"kubernetes.io/projected/25d27285-1251-4bd5-ad74-9c17e3019ab3-kube-api-access-t5f4d\") pod \"collect-profiles-29556765-bblnm\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.486871 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:00 crc kubenswrapper[4786]: I0313 12:45:00.921216 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"]
Mar 13 12:45:01 crc kubenswrapper[4786]: I0313 12:45:01.124092 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm" event={"ID":"25d27285-1251-4bd5-ad74-9c17e3019ab3","Type":"ContainerStarted","Data":"e33ba0a12d99a0d37246c547ab524f0ec50fde8017f43b9aa4f80ca2d25d71ec"}
Mar 13 12:45:01 crc kubenswrapper[4786]: I0313 12:45:01.124242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm" event={"ID":"25d27285-1251-4bd5-ad74-9c17e3019ab3","Type":"ContainerStarted","Data":"58ed17918d0ab8b80963a7e224a201d4a6834b145b6f2d396485a8cb47223f2b"}
Mar 13 12:45:01 crc kubenswrapper[4786]: I0313 12:45:01.141197 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm" podStartSLOduration=1.141174614 podStartE2EDuration="1.141174614s" podCreationTimestamp="2026-03-13 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:45:01.138965864 +0000 UTC m=+3488.418619321" watchObservedRunningTime="2026-03-13 12:45:01.141174614 +0000 UTC m=+3488.420828081"
Mar 13 12:45:02 crc kubenswrapper[4786]: I0313 12:45:02.143008 4786 generic.go:334] "Generic (PLEG): container finished" podID="25d27285-1251-4bd5-ad74-9c17e3019ab3" containerID="e33ba0a12d99a0d37246c547ab524f0ec50fde8017f43b9aa4f80ca2d25d71ec" exitCode=0
Mar 13 12:45:02 crc kubenswrapper[4786]: I0313 12:45:02.143208 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm" event={"ID":"25d27285-1251-4bd5-ad74-9c17e3019ab3","Type":"ContainerDied","Data":"e33ba0a12d99a0d37246c547ab524f0ec50fde8017f43b9aa4f80ca2d25d71ec"}
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.406871 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.489524 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25d27285-1251-4bd5-ad74-9c17e3019ab3-config-volume\") pod \"25d27285-1251-4bd5-ad74-9c17e3019ab3\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") "
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.489602 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25d27285-1251-4bd5-ad74-9c17e3019ab3-secret-volume\") pod \"25d27285-1251-4bd5-ad74-9c17e3019ab3\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") "
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.489656 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5f4d\" (UniqueName: \"kubernetes.io/projected/25d27285-1251-4bd5-ad74-9c17e3019ab3-kube-api-access-t5f4d\") pod \"25d27285-1251-4bd5-ad74-9c17e3019ab3\" (UID: \"25d27285-1251-4bd5-ad74-9c17e3019ab3\") "
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.491314 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d27285-1251-4bd5-ad74-9c17e3019ab3-config-volume" (OuterVolumeSpecName: "config-volume") pod "25d27285-1251-4bd5-ad74-9c17e3019ab3" (UID: "25d27285-1251-4bd5-ad74-9c17e3019ab3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.495362 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d27285-1251-4bd5-ad74-9c17e3019ab3-kube-api-access-t5f4d" (OuterVolumeSpecName: "kube-api-access-t5f4d") pod "25d27285-1251-4bd5-ad74-9c17e3019ab3" (UID: "25d27285-1251-4bd5-ad74-9c17e3019ab3"). InnerVolumeSpecName "kube-api-access-t5f4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.495619 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d27285-1251-4bd5-ad74-9c17e3019ab3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "25d27285-1251-4bd5-ad74-9c17e3019ab3" (UID: "25d27285-1251-4bd5-ad74-9c17e3019ab3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.592403 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25d27285-1251-4bd5-ad74-9c17e3019ab3-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.592475 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25d27285-1251-4bd5-ad74-9c17e3019ab3-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 12:45:03 crc kubenswrapper[4786]: I0313 12:45:03.592504 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5f4d\" (UniqueName: \"kubernetes.io/projected/25d27285-1251-4bd5-ad74-9c17e3019ab3-kube-api-access-t5f4d\") on node \"crc\" DevicePath \"\""
Mar 13 12:45:04 crc kubenswrapper[4786]: I0313 12:45:04.159768 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm" event={"ID":"25d27285-1251-4bd5-ad74-9c17e3019ab3","Type":"ContainerDied","Data":"58ed17918d0ab8b80963a7e224a201d4a6834b145b6f2d396485a8cb47223f2b"}
Mar 13 12:45:04 crc kubenswrapper[4786]: I0313 12:45:04.159822 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58ed17918d0ab8b80963a7e224a201d4a6834b145b6f2d396485a8cb47223f2b"
Mar 13 12:45:04 crc kubenswrapper[4786]: I0313 12:45:04.159820 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-bblnm"
Mar 13 12:45:04 crc kubenswrapper[4786]: I0313 12:45:04.221356 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l"]
Mar 13 12:45:04 crc kubenswrapper[4786]: I0313 12:45:04.226393 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-hsr7l"]
Mar 13 12:45:05 crc kubenswrapper[4786]: I0313 12:45:05.449670 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5" path="/var/lib/kubelet/pods/37f2b1db-8b0a-414d-a5f1-bfc6f39ef7a5/volumes"
Mar 13 12:45:18 crc kubenswrapper[4786]: I0313 12:45:18.032570 4786 scope.go:117] "RemoveContainer" containerID="3270d775451c2ba76a2d69b1754926133289fe70578f3f6a85a8844e38dc2860"
Mar 13 12:45:38 crc kubenswrapper[4786]: I0313 12:45:38.169702 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:45:38 crc kubenswrapper[4786]: I0313 12:45:38.170252 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.145983 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556766-nk2j7"]
Mar 13 12:46:00 crc kubenswrapper[4786]: E0313 12:46:00.146779 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d27285-1251-4bd5-ad74-9c17e3019ab3" containerName="collect-profiles"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.146791 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d27285-1251-4bd5-ad74-9c17e3019ab3" containerName="collect-profiles"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.146952 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d27285-1251-4bd5-ad74-9c17e3019ab3" containerName="collect-profiles"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.147373 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-nk2j7"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.149457 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.149457 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.150300 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.152524 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nk45\" (UniqueName: \"kubernetes.io/projected/1ded6d40-a34e-4d6a-8a09-39d7ad8c0962-kube-api-access-6nk45\") pod \"auto-csr-approver-29556766-nk2j7\" (UID: \"1ded6d40-a34e-4d6a-8a09-39d7ad8c0962\") " pod="openshift-infra/auto-csr-approver-29556766-nk2j7"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.157344 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-nk2j7"]
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.253987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nk45\" (UniqueName: \"kubernetes.io/projected/1ded6d40-a34e-4d6a-8a09-39d7ad8c0962-kube-api-access-6nk45\") pod \"auto-csr-approver-29556766-nk2j7\" (UID: \"1ded6d40-a34e-4d6a-8a09-39d7ad8c0962\") " pod="openshift-infra/auto-csr-approver-29556766-nk2j7"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.277231 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nk45\" (UniqueName: \"kubernetes.io/projected/1ded6d40-a34e-4d6a-8a09-39d7ad8c0962-kube-api-access-6nk45\") pod \"auto-csr-approver-29556766-nk2j7\" (UID: \"1ded6d40-a34e-4d6a-8a09-39d7ad8c0962\") " pod="openshift-infra/auto-csr-approver-29556766-nk2j7"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.462712 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-nk2j7"
Mar 13 12:46:00 crc kubenswrapper[4786]: I0313 12:46:00.991335 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-nk2j7"]
Mar 13 12:46:01 crc kubenswrapper[4786]: I0313 12:46:01.594603 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-nk2j7" event={"ID":"1ded6d40-a34e-4d6a-8a09-39d7ad8c0962","Type":"ContainerStarted","Data":"18c2acc980e981a42d8ab0bedfd7719cc7ac80141a26395fb2795a17ea721169"}
Mar 13 12:46:02 crc kubenswrapper[4786]: I0313 12:46:02.607702 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ded6d40-a34e-4d6a-8a09-39d7ad8c0962" containerID="8f79846e5b12fa73bc66821718ba94063708e54d09211f41499e125309f67d67" exitCode=0
Mar 13 12:46:02 crc kubenswrapper[4786]: I0313 12:46:02.608147 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-nk2j7" event={"ID":"1ded6d40-a34e-4d6a-8a09-39d7ad8c0962","Type":"ContainerDied","Data":"8f79846e5b12fa73bc66821718ba94063708e54d09211f41499e125309f67d67"}
Mar 13 12:46:03 crc kubenswrapper[4786]: I0313 12:46:03.861206 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-nk2j7"
Mar 13 12:46:04 crc kubenswrapper[4786]: I0313 12:46:04.005214 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nk45\" (UniqueName: \"kubernetes.io/projected/1ded6d40-a34e-4d6a-8a09-39d7ad8c0962-kube-api-access-6nk45\") pod \"1ded6d40-a34e-4d6a-8a09-39d7ad8c0962\" (UID: \"1ded6d40-a34e-4d6a-8a09-39d7ad8c0962\") "
Mar 13 12:46:04 crc kubenswrapper[4786]: I0313 12:46:04.017142 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ded6d40-a34e-4d6a-8a09-39d7ad8c0962-kube-api-access-6nk45" (OuterVolumeSpecName: "kube-api-access-6nk45") pod "1ded6d40-a34e-4d6a-8a09-39d7ad8c0962" (UID: "1ded6d40-a34e-4d6a-8a09-39d7ad8c0962"). InnerVolumeSpecName "kube-api-access-6nk45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:46:04 crc kubenswrapper[4786]: I0313 12:46:04.106681 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nk45\" (UniqueName: \"kubernetes.io/projected/1ded6d40-a34e-4d6a-8a09-39d7ad8c0962-kube-api-access-6nk45\") on node \"crc\" DevicePath \"\""
Mar 13 12:46:04 crc kubenswrapper[4786]: I0313 12:46:04.620644 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-nk2j7" event={"ID":"1ded6d40-a34e-4d6a-8a09-39d7ad8c0962","Type":"ContainerDied","Data":"18c2acc980e981a42d8ab0bedfd7719cc7ac80141a26395fb2795a17ea721169"}
Mar 13 12:46:04 crc kubenswrapper[4786]: I0313 12:46:04.621010 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18c2acc980e981a42d8ab0bedfd7719cc7ac80141a26395fb2795a17ea721169"
Mar 13 12:46:04 crc kubenswrapper[4786]: I0313 12:46:04.620688 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-nk2j7"
Mar 13 12:46:04 crc kubenswrapper[4786]: I0313 12:46:04.929655 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-p7qz5"]
Mar 13 12:46:04 crc kubenswrapper[4786]: I0313 12:46:04.935021 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-p7qz5"]
Mar 13 12:46:05 crc kubenswrapper[4786]: I0313 12:46:05.447872 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41dadc3c-5d43-42e1-8c2d-856105843c8d" path="/var/lib/kubelet/pods/41dadc3c-5d43-42e1-8c2d-856105843c8d/volumes"
Mar 13 12:46:08 crc kubenswrapper[4786]: I0313 12:46:08.169458 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:46:08 crc kubenswrapper[4786]: I0313 12:46:08.170800 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:46:18 crc kubenswrapper[4786]: I0313 12:46:18.082754 4786 scope.go:117] "RemoveContainer" containerID="ec8d1addf9e2ffde656a9745a7f5202c12e0c9691d404a33e9e80f4861145dc0"
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.169013 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.169531 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.169586 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8"
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.170174 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb1ee42ab47d25b0b7c8b74d842dae4ca653711715ffa1c99328b89a180c93ba"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.170240 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://bb1ee42ab47d25b0b7c8b74d842dae4ca653711715ffa1c99328b89a180c93ba" gracePeriod=600
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.888749 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="bb1ee42ab47d25b0b7c8b74d842dae4ca653711715ffa1c99328b89a180c93ba" exitCode=0
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.888794 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"bb1ee42ab47d25b0b7c8b74d842dae4ca653711715ffa1c99328b89a180c93ba"}
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.889481 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219"}
Mar 13 12:46:38 crc kubenswrapper[4786]: I0313 12:46:38.889504 4786 scope.go:117] "RemoveContainer" containerID="6c4fd255cc3f3ceeb57b1bb58d92d2279e3dd5ad3f87d5cd9397b827d16c1086"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.135828 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556768-gzsm7"]
Mar 13 12:48:00 crc kubenswrapper[4786]: E0313 12:48:00.136750 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ded6d40-a34e-4d6a-8a09-39d7ad8c0962" containerName="oc"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.136768 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ded6d40-a34e-4d6a-8a09-39d7ad8c0962" containerName="oc"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.136975 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ded6d40-a34e-4d6a-8a09-39d7ad8c0962" containerName="oc"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.137564 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-gzsm7"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.141677 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.141874 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.142271 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.144744 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-gzsm7"]
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.192504 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28wxr\" (UniqueName: \"kubernetes.io/projected/425e116d-47dc-4d62-a7c9-367ab1abc836-kube-api-access-28wxr\") pod \"auto-csr-approver-29556768-gzsm7\" (UID: \"425e116d-47dc-4d62-a7c9-367ab1abc836\") " pod="openshift-infra/auto-csr-approver-29556768-gzsm7"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.293483 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28wxr\" (UniqueName: \"kubernetes.io/projected/425e116d-47dc-4d62-a7c9-367ab1abc836-kube-api-access-28wxr\") pod \"auto-csr-approver-29556768-gzsm7\" (UID: \"425e116d-47dc-4d62-a7c9-367ab1abc836\") " pod="openshift-infra/auto-csr-approver-29556768-gzsm7"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.311523 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28wxr\" (UniqueName: \"kubernetes.io/projected/425e116d-47dc-4d62-a7c9-367ab1abc836-kube-api-access-28wxr\") pod \"auto-csr-approver-29556768-gzsm7\" (UID: \"425e116d-47dc-4d62-a7c9-367ab1abc836\") " pod="openshift-infra/auto-csr-approver-29556768-gzsm7"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.462313 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-gzsm7"
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.889663 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-gzsm7"]
Mar 13 12:48:00 crc kubenswrapper[4786]: I0313 12:48:00.907232 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 12:48:01 crc kubenswrapper[4786]: I0313 12:48:01.571330 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-gzsm7" event={"ID":"425e116d-47dc-4d62-a7c9-367ab1abc836","Type":"ContainerStarted","Data":"b7c44984e6f55b6066c09d6b2720421e304dd5dddcd10d771b6261068d4fcf6f"}
Mar 13 12:48:02 crc kubenswrapper[4786]: I0313 12:48:02.578095 4786 generic.go:334] "Generic (PLEG): container finished" podID="425e116d-47dc-4d62-a7c9-367ab1abc836" containerID="ce322772669a3991a6396ef3860ff6e4679c9953e192b1b3f120a359ff278b5c" exitCode=0
Mar 13 12:48:02 crc kubenswrapper[4786]: I0313 12:48:02.578380 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-gzsm7" event={"ID":"425e116d-47dc-4d62-a7c9-367ab1abc836","Type":"ContainerDied","Data":"ce322772669a3991a6396ef3860ff6e4679c9953e192b1b3f120a359ff278b5c"}
Mar 13 12:48:03 crc kubenswrapper[4786]: I0313 12:48:03.831471 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-gzsm7"
Mar 13 12:48:03 crc kubenswrapper[4786]: I0313 12:48:03.946015 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28wxr\" (UniqueName: \"kubernetes.io/projected/425e116d-47dc-4d62-a7c9-367ab1abc836-kube-api-access-28wxr\") pod \"425e116d-47dc-4d62-a7c9-367ab1abc836\" (UID: \"425e116d-47dc-4d62-a7c9-367ab1abc836\") "
Mar 13 12:48:03 crc kubenswrapper[4786]: I0313 12:48:03.950895 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425e116d-47dc-4d62-a7c9-367ab1abc836-kube-api-access-28wxr" (OuterVolumeSpecName: "kube-api-access-28wxr") pod "425e116d-47dc-4d62-a7c9-367ab1abc836" (UID: "425e116d-47dc-4d62-a7c9-367ab1abc836"). InnerVolumeSpecName "kube-api-access-28wxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:48:04 crc kubenswrapper[4786]: I0313 12:48:04.048175 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28wxr\" (UniqueName: \"kubernetes.io/projected/425e116d-47dc-4d62-a7c9-367ab1abc836-kube-api-access-28wxr\") on node \"crc\" DevicePath \"\""
Mar 13 12:48:04 crc kubenswrapper[4786]: I0313 12:48:04.593558 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-gzsm7" event={"ID":"425e116d-47dc-4d62-a7c9-367ab1abc836","Type":"ContainerDied","Data":"b7c44984e6f55b6066c09d6b2720421e304dd5dddcd10d771b6261068d4fcf6f"}
Mar 13 12:48:04 crc kubenswrapper[4786]: I0313 12:48:04.593610 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c44984e6f55b6066c09d6b2720421e304dd5dddcd10d771b6261068d4fcf6f"
Mar 13 12:48:04 crc kubenswrapper[4786]: I0313 12:48:04.593631 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-gzsm7"
Mar 13 12:48:04 crc kubenswrapper[4786]: I0313 12:48:04.898802 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-xb6sx"]
Mar 13 12:48:04 crc kubenswrapper[4786]: I0313 12:48:04.905566 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-xb6sx"]
Mar 13 12:48:05 crc kubenswrapper[4786]: I0313 12:48:05.451716 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c97978b-34cc-42ae-89b4-23165dcb1d53" path="/var/lib/kubelet/pods/3c97978b-34cc-42ae-89b4-23165dcb1d53/volumes"
Mar 13 12:48:18 crc kubenswrapper[4786]: I0313 12:48:18.170060 4786 scope.go:117] "RemoveContainer" containerID="e13b6ee3b790ae3912adab646429895b82e943e7c8b6b9b4b81cdc38f494845c"
Mar 13 12:48:38 crc kubenswrapper[4786]: I0313 12:48:38.169801 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:48:38 crc kubenswrapper[4786]: I0313 12:48:38.170685 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.579873 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m5fm8"]
Mar 13 12:49:01 crc kubenswrapper[4786]: E0313 12:49:01.581072 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425e116d-47dc-4d62-a7c9-367ab1abc836" containerName="oc"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.581090 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="425e116d-47dc-4d62-a7c9-367ab1abc836" containerName="oc"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.581286 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="425e116d-47dc-4d62-a7c9-367ab1abc836" containerName="oc"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.582649 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.594351 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5fm8"]
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.663765 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tprgn\" (UniqueName: \"kubernetes.io/projected/4b42b967-dfc3-4e13-83ba-a867e862444a-kube-api-access-tprgn\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.664027 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-catalog-content\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.664232 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-utilities\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.765669 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tprgn\" (UniqueName: \"kubernetes.io/projected/4b42b967-dfc3-4e13-83ba-a867e862444a-kube-api-access-tprgn\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.765772 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-catalog-content\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.765795 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-utilities\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.766291 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-catalog-content\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.766346 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-utilities\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.788980 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tprgn\" (UniqueName: \"kubernetes.io/projected/4b42b967-dfc3-4e13-83ba-a867e862444a-kube-api-access-tprgn\") pod \"redhat-marketplace-m5fm8\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:01 crc kubenswrapper[4786]: I0313 12:49:01.961117 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:02 crc kubenswrapper[4786]: I0313 12:49:02.403834 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5fm8"]
Mar 13 12:49:03 crc kubenswrapper[4786]: I0313 12:49:03.007798 4786 generic.go:334] "Generic (PLEG): container finished" podID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerID="b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1" exitCode=0
Mar 13 12:49:03 crc kubenswrapper[4786]: I0313 12:49:03.007939 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5fm8" event={"ID":"4b42b967-dfc3-4e13-83ba-a867e862444a","Type":"ContainerDied","Data":"b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1"}
Mar 13 12:49:03 crc kubenswrapper[4786]: I0313 12:49:03.009927 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5fm8" event={"ID":"4b42b967-dfc3-4e13-83ba-a867e862444a","Type":"ContainerStarted","Data":"5933a080c6ef14fe117ad31c3cdbc15a62178adbf6b755f1c65d9c3e6d92fdb5"}
Mar 13 12:49:04 crc kubenswrapper[4786]: I0313 12:49:04.020478 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5fm8" event={"ID":"4b42b967-dfc3-4e13-83ba-a867e862444a","Type":"ContainerStarted","Data":"5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d"}
Mar 13 12:49:05 crc kubenswrapper[4786]: I0313 12:49:05.028845 4786 generic.go:334] "Generic (PLEG): container finished" podID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerID="5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d" exitCode=0
Mar 13 12:49:05 crc kubenswrapper[4786]: I0313 12:49:05.028954 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5fm8" event={"ID":"4b42b967-dfc3-4e13-83ba-a867e862444a","Type":"ContainerDied","Data":"5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d"}
Mar 13 12:49:06 crc kubenswrapper[4786]: I0313 12:49:06.039332 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5fm8" event={"ID":"4b42b967-dfc3-4e13-83ba-a867e862444a","Type":"ContainerStarted","Data":"dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3"}
Mar 13 12:49:06 crc kubenswrapper[4786]: I0313 12:49:06.064849 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m5fm8" podStartSLOduration=2.568711962 podStartE2EDuration="5.06482799s" podCreationTimestamp="2026-03-13 12:49:01 +0000 UTC" firstStartedPulling="2026-03-13 12:49:03.009161067 +0000 UTC m=+3730.288814514" lastFinishedPulling="2026-03-13 12:49:05.505277095 +0000 UTC m=+3732.784930542" observedRunningTime="2026-03-13 12:49:06.061299313 +0000 UTC m=+3733.340952760" watchObservedRunningTime="2026-03-13 12:49:06.06482799 +0000 UTC m=+3733.344481437"
Mar 13 12:49:08 crc kubenswrapper[4786]: I0313 12:49:08.168781 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:49:08 crc kubenswrapper[4786]: I0313 12:49:08.170246 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:49:11 crc kubenswrapper[4786]: I0313 12:49:11.962039 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:11 crc kubenswrapper[4786]: I0313 12:49:11.963534 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:12 crc kubenswrapper[4786]: I0313 12:49:12.038514 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:12 crc kubenswrapper[4786]: I0313 12:49:12.130099 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m5fm8"
Mar 13 12:49:12 crc kubenswrapper[4786]: I0313 12:49:12.272957 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5fm8"]
Mar 13 12:49:14 crc kubenswrapper[4786]: I0313 12:49:14.098285 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m5fm8" podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerName="registry-server" containerID="cri-o://dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3" gracePeriod=2
Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.038018 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5fm8" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.058679 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-utilities\") pod \"4b42b967-dfc3-4e13-83ba-a867e862444a\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.059569 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-catalog-content\") pod \"4b42b967-dfc3-4e13-83ba-a867e862444a\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.060120 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tprgn\" (UniqueName: \"kubernetes.io/projected/4b42b967-dfc3-4e13-83ba-a867e862444a-kube-api-access-tprgn\") pod \"4b42b967-dfc3-4e13-83ba-a867e862444a\" (UID: \"4b42b967-dfc3-4e13-83ba-a867e862444a\") " Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.061449 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-utilities" (OuterVolumeSpecName: "utilities") pod "4b42b967-dfc3-4e13-83ba-a867e862444a" (UID: "4b42b967-dfc3-4e13-83ba-a867e862444a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.072968 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b42b967-dfc3-4e13-83ba-a867e862444a-kube-api-access-tprgn" (OuterVolumeSpecName: "kube-api-access-tprgn") pod "4b42b967-dfc3-4e13-83ba-a867e862444a" (UID: "4b42b967-dfc3-4e13-83ba-a867e862444a"). InnerVolumeSpecName "kube-api-access-tprgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.104058 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b42b967-dfc3-4e13-83ba-a867e862444a" (UID: "4b42b967-dfc3-4e13-83ba-a867e862444a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.106488 4786 generic.go:334] "Generic (PLEG): container finished" podID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerID="dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3" exitCode=0 Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.106551 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5fm8" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.106556 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5fm8" event={"ID":"4b42b967-dfc3-4e13-83ba-a867e862444a","Type":"ContainerDied","Data":"dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3"} Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.106594 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5fm8" event={"ID":"4b42b967-dfc3-4e13-83ba-a867e862444a","Type":"ContainerDied","Data":"5933a080c6ef14fe117ad31c3cdbc15a62178adbf6b755f1c65d9c3e6d92fdb5"} Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.106615 4786 scope.go:117] "RemoveContainer" containerID="dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.124654 4786 scope.go:117] "RemoveContainer" containerID="5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 
12:49:15.135097 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5fm8"] Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.142567 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5fm8"] Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.163750 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.163807 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b42b967-dfc3-4e13-83ba-a867e862444a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.163822 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tprgn\" (UniqueName: \"kubernetes.io/projected/4b42b967-dfc3-4e13-83ba-a867e862444a-kube-api-access-tprgn\") on node \"crc\" DevicePath \"\"" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.167620 4786 scope.go:117] "RemoveContainer" containerID="b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.183757 4786 scope.go:117] "RemoveContainer" containerID="dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3" Mar 13 12:49:15 crc kubenswrapper[4786]: E0313 12:49:15.184223 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3\": container with ID starting with dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3 not found: ID does not exist" containerID="dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.184272 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3"} err="failed to get container status \"dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3\": rpc error: code = NotFound desc = could not find container \"dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3\": container with ID starting with dcbed493acdb96c4ec3ac6379d0a297f78bbd9b7a930c2610fe65b0b68cc0ab3 not found: ID does not exist" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.184300 4786 scope.go:117] "RemoveContainer" containerID="5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d" Mar 13 12:49:15 crc kubenswrapper[4786]: E0313 12:49:15.184632 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d\": container with ID starting with 5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d not found: ID does not exist" containerID="5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.184667 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d"} err="failed to get container status \"5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d\": rpc error: code = NotFound desc = could not find container \"5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d\": container with ID starting with 5b7a0d06d0dc3c78abc76ed6d95ebd3ae6a4579cbc1f682aa0f4bd6f0e63d59d not found: ID does not exist" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.184685 4786 scope.go:117] "RemoveContainer" containerID="b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1" Mar 13 12:49:15 crc kubenswrapper[4786]: E0313 
12:49:15.185218 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1\": container with ID starting with b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1 not found: ID does not exist" containerID="b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.185248 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1"} err="failed to get container status \"b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1\": rpc error: code = NotFound desc = could not find container \"b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1\": container with ID starting with b038fc45b2904c25e7b8222a5dfd6f9de2780fcf97ef284ea349a57332c8caa1 not found: ID does not exist" Mar 13 12:49:15 crc kubenswrapper[4786]: I0313 12:49:15.450578 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" path="/var/lib/kubelet/pods/4b42b967-dfc3-4e13-83ba-a867e862444a/volumes" Mar 13 12:49:38 crc kubenswrapper[4786]: I0313 12:49:38.170062 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:49:38 crc kubenswrapper[4786]: I0313 12:49:38.171498 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 13 12:49:38 crc kubenswrapper[4786]: I0313 12:49:38.171581 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 12:49:38 crc kubenswrapper[4786]: I0313 12:49:38.173732 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:49:38 crc kubenswrapper[4786]: I0313 12:49:38.173813 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" gracePeriod=600 Mar 13 12:49:38 crc kubenswrapper[4786]: E0313 12:49:38.302397 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:49:39 crc kubenswrapper[4786]: I0313 12:49:39.292584 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" exitCode=0 Mar 13 12:49:39 crc kubenswrapper[4786]: I0313 12:49:39.292632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" 
event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219"} Mar 13 12:49:39 crc kubenswrapper[4786]: I0313 12:49:39.292671 4786 scope.go:117] "RemoveContainer" containerID="bb1ee42ab47d25b0b7c8b74d842dae4ca653711715ffa1c99328b89a180c93ba" Mar 13 12:49:39 crc kubenswrapper[4786]: I0313 12:49:39.293238 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:49:39 crc kubenswrapper[4786]: E0313 12:49:39.293509 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:49:51 crc kubenswrapper[4786]: I0313 12:49:51.440910 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:49:51 crc kubenswrapper[4786]: E0313 12:49:51.441911 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.151019 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556770-zn8vk"] Mar 13 12:50:00 crc kubenswrapper[4786]: E0313 12:50:00.153393 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerName="extract-utilities" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.153515 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerName="extract-utilities" Mar 13 12:50:00 crc kubenswrapper[4786]: E0313 12:50:00.153623 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerName="registry-server" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.153707 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerName="registry-server" Mar 13 12:50:00 crc kubenswrapper[4786]: E0313 12:50:00.153790 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerName="extract-content" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.153863 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerName="extract-content" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.154128 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b42b967-dfc3-4e13-83ba-a867e862444a" containerName="registry-server" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.154729 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-zn8vk" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.156938 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.156938 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.157271 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.158616 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-zn8vk"] Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.320646 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sntd\" (UniqueName: \"kubernetes.io/projected/850104c6-c39d-4238-a99d-33127a1a23ca-kube-api-access-5sntd\") pod \"auto-csr-approver-29556770-zn8vk\" (UID: \"850104c6-c39d-4238-a99d-33127a1a23ca\") " pod="openshift-infra/auto-csr-approver-29556770-zn8vk" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.422520 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sntd\" (UniqueName: \"kubernetes.io/projected/850104c6-c39d-4238-a99d-33127a1a23ca-kube-api-access-5sntd\") pod \"auto-csr-approver-29556770-zn8vk\" (UID: \"850104c6-c39d-4238-a99d-33127a1a23ca\") " pod="openshift-infra/auto-csr-approver-29556770-zn8vk" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.442907 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sntd\" (UniqueName: \"kubernetes.io/projected/850104c6-c39d-4238-a99d-33127a1a23ca-kube-api-access-5sntd\") pod \"auto-csr-approver-29556770-zn8vk\" (UID: \"850104c6-c39d-4238-a99d-33127a1a23ca\") " 
pod="openshift-infra/auto-csr-approver-29556770-zn8vk" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.476672 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-zn8vk" Mar 13 12:50:00 crc kubenswrapper[4786]: I0313 12:50:00.905239 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-zn8vk"] Mar 13 12:50:01 crc kubenswrapper[4786]: I0313 12:50:01.461054 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-zn8vk" event={"ID":"850104c6-c39d-4238-a99d-33127a1a23ca","Type":"ContainerStarted","Data":"db947c9498a8a2f8c3282ade87635640b0f5aa54f81c2a52a05f4d47cd5667a2"} Mar 13 12:50:03 crc kubenswrapper[4786]: I0313 12:50:03.479630 4786 generic.go:334] "Generic (PLEG): container finished" podID="850104c6-c39d-4238-a99d-33127a1a23ca" containerID="0fa9f9952b300d0cf12ddc0f68180a562ac406b26ddbcedc1aa0b6c9356f16f4" exitCode=0 Mar 13 12:50:03 crc kubenswrapper[4786]: I0313 12:50:03.479732 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-zn8vk" event={"ID":"850104c6-c39d-4238-a99d-33127a1a23ca","Type":"ContainerDied","Data":"0fa9f9952b300d0cf12ddc0f68180a562ac406b26ddbcedc1aa0b6c9356f16f4"} Mar 13 12:50:04 crc kubenswrapper[4786]: I0313 12:50:04.827560 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-zn8vk" Mar 13 12:50:04 crc kubenswrapper[4786]: I0313 12:50:04.994179 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sntd\" (UniqueName: \"kubernetes.io/projected/850104c6-c39d-4238-a99d-33127a1a23ca-kube-api-access-5sntd\") pod \"850104c6-c39d-4238-a99d-33127a1a23ca\" (UID: \"850104c6-c39d-4238-a99d-33127a1a23ca\") " Mar 13 12:50:04 crc kubenswrapper[4786]: I0313 12:50:04.999642 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/850104c6-c39d-4238-a99d-33127a1a23ca-kube-api-access-5sntd" (OuterVolumeSpecName: "kube-api-access-5sntd") pod "850104c6-c39d-4238-a99d-33127a1a23ca" (UID: "850104c6-c39d-4238-a99d-33127a1a23ca"). InnerVolumeSpecName "kube-api-access-5sntd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:50:05 crc kubenswrapper[4786]: I0313 12:50:05.096472 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sntd\" (UniqueName: \"kubernetes.io/projected/850104c6-c39d-4238-a99d-33127a1a23ca-kube-api-access-5sntd\") on node \"crc\" DevicePath \"\"" Mar 13 12:50:05 crc kubenswrapper[4786]: I0313 12:50:05.498214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-zn8vk" event={"ID":"850104c6-c39d-4238-a99d-33127a1a23ca","Type":"ContainerDied","Data":"db947c9498a8a2f8c3282ade87635640b0f5aa54f81c2a52a05f4d47cd5667a2"} Mar 13 12:50:05 crc kubenswrapper[4786]: I0313 12:50:05.498268 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db947c9498a8a2f8c3282ade87635640b0f5aa54f81c2a52a05f4d47cd5667a2" Mar 13 12:50:05 crc kubenswrapper[4786]: I0313 12:50:05.498282 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-zn8vk" Mar 13 12:50:05 crc kubenswrapper[4786]: I0313 12:50:05.891379 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-8rbq4"] Mar 13 12:50:05 crc kubenswrapper[4786]: I0313 12:50:05.897770 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-8rbq4"] Mar 13 12:50:06 crc kubenswrapper[4786]: I0313 12:50:06.441554 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:50:06 crc kubenswrapper[4786]: E0313 12:50:06.441824 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.300016 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6jfvm"] Mar 13 12:50:07 crc kubenswrapper[4786]: E0313 12:50:07.300796 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850104c6-c39d-4238-a99d-33127a1a23ca" containerName="oc" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.300820 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="850104c6-c39d-4238-a99d-33127a1a23ca" containerName="oc" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.301081 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="850104c6-c39d-4238-a99d-33127a1a23ca" containerName="oc" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.302696 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.309844 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6jfvm"] Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.434435 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-catalog-content\") pod \"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.434540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-utilities\") pod \"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.434694 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4c4t\" (UniqueName: \"kubernetes.io/projected/70d7f4f2-ab94-4589-9d18-5566da8f54d4-kube-api-access-b4c4t\") pod \"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.449535 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6824633-eedd-469d-8601-9d1d591d1af1" path="/var/lib/kubelet/pods/d6824633-eedd-469d-8601-9d1d591d1af1/volumes" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.536275 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-utilities\") 
pod \"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.536366 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4c4t\" (UniqueName: \"kubernetes.io/projected/70d7f4f2-ab94-4589-9d18-5566da8f54d4-kube-api-access-b4c4t\") pod \"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.536446 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-catalog-content\") pod \"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.536849 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-utilities\") pod \"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.536986 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-catalog-content\") pod \"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.554867 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4c4t\" (UniqueName: \"kubernetes.io/projected/70d7f4f2-ab94-4589-9d18-5566da8f54d4-kube-api-access-b4c4t\") pod 
\"certified-operators-6jfvm\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:07 crc kubenswrapper[4786]: I0313 12:50:07.624713 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:08 crc kubenswrapper[4786]: I0313 12:50:08.124403 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6jfvm"] Mar 13 12:50:08 crc kubenswrapper[4786]: I0313 12:50:08.530839 4786 generic.go:334] "Generic (PLEG): container finished" podID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerID="103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0" exitCode=0 Mar 13 12:50:08 crc kubenswrapper[4786]: I0313 12:50:08.531061 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jfvm" event={"ID":"70d7f4f2-ab94-4589-9d18-5566da8f54d4","Type":"ContainerDied","Data":"103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0"} Mar 13 12:50:08 crc kubenswrapper[4786]: I0313 12:50:08.531154 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jfvm" event={"ID":"70d7f4f2-ab94-4589-9d18-5566da8f54d4","Type":"ContainerStarted","Data":"aaab742fde5db8a3aa0ad74580eebc2c2578e55f60e41f0ffca489302b6493b6"} Mar 13 12:50:09 crc kubenswrapper[4786]: I0313 12:50:09.540755 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jfvm" event={"ID":"70d7f4f2-ab94-4589-9d18-5566da8f54d4","Type":"ContainerStarted","Data":"77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9"} Mar 13 12:50:10 crc kubenswrapper[4786]: I0313 12:50:10.550346 4786 generic.go:334] "Generic (PLEG): container finished" podID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerID="77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9" exitCode=0 Mar 13 12:50:10 
crc kubenswrapper[4786]: I0313 12:50:10.550437 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jfvm" event={"ID":"70d7f4f2-ab94-4589-9d18-5566da8f54d4","Type":"ContainerDied","Data":"77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9"} Mar 13 12:50:11 crc kubenswrapper[4786]: I0313 12:50:11.560337 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jfvm" event={"ID":"70d7f4f2-ab94-4589-9d18-5566da8f54d4","Type":"ContainerStarted","Data":"3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b"} Mar 13 12:50:17 crc kubenswrapper[4786]: I0313 12:50:17.625441 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:17 crc kubenswrapper[4786]: I0313 12:50:17.625764 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:17 crc kubenswrapper[4786]: I0313 12:50:17.685188 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:17 crc kubenswrapper[4786]: I0313 12:50:17.703509 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6jfvm" podStartSLOduration=8.2793819 podStartE2EDuration="10.703493322s" podCreationTimestamp="2026-03-13 12:50:07 +0000 UTC" firstStartedPulling="2026-03-13 12:50:08.535327992 +0000 UTC m=+3795.814981439" lastFinishedPulling="2026-03-13 12:50:10.959439414 +0000 UTC m=+3798.239092861" observedRunningTime="2026-03-13 12:50:11.600511194 +0000 UTC m=+3798.880164641" watchObservedRunningTime="2026-03-13 12:50:17.703493322 +0000 UTC m=+3804.983146769" Mar 13 12:50:18 crc kubenswrapper[4786]: I0313 12:50:18.261001 4786 scope.go:117] "RemoveContainer" 
containerID="23105baea5b7da897e6f96af8407d3481dfb4ac024e19a351a9946a5803be46c" Mar 13 12:50:18 crc kubenswrapper[4786]: I0313 12:50:18.650488 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:18 crc kubenswrapper[4786]: I0313 12:50:18.705559 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6jfvm"] Mar 13 12:50:19 crc kubenswrapper[4786]: I0313 12:50:19.441051 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:50:19 crc kubenswrapper[4786]: E0313 12:50:19.441576 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:50:20 crc kubenswrapper[4786]: I0313 12:50:20.629301 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6jfvm" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerName="registry-server" containerID="cri-o://3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b" gracePeriod=2 Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.577893 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.659616 4786 generic.go:334] "Generic (PLEG): container finished" podID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerID="3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b" exitCode=0 Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.659958 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jfvm" event={"ID":"70d7f4f2-ab94-4589-9d18-5566da8f54d4","Type":"ContainerDied","Data":"3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b"} Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.659991 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jfvm" event={"ID":"70d7f4f2-ab94-4589-9d18-5566da8f54d4","Type":"ContainerDied","Data":"aaab742fde5db8a3aa0ad74580eebc2c2578e55f60e41f0ffca489302b6493b6"} Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.660040 4786 scope.go:117] "RemoveContainer" containerID="3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.660167 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jfvm" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.678501 4786 scope.go:117] "RemoveContainer" containerID="77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.695799 4786 scope.go:117] "RemoveContainer" containerID="103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.728112 4786 scope.go:117] "RemoveContainer" containerID="3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b" Mar 13 12:50:21 crc kubenswrapper[4786]: E0313 12:50:21.728465 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b\": container with ID starting with 3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b not found: ID does not exist" containerID="3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.728567 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b"} err="failed to get container status \"3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b\": rpc error: code = NotFound desc = could not find container \"3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b\": container with ID starting with 3a5fd2ed6432ae2b0edd33d7e26824bc3286594d7b5ecd9b0d0b108b6c0e2f7b not found: ID does not exist" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.728758 4786 scope.go:117] "RemoveContainer" containerID="77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9" Mar 13 12:50:21 crc kubenswrapper[4786]: E0313 12:50:21.729110 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9\": container with ID starting with 77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9 not found: ID does not exist" containerID="77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.729447 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9"} err="failed to get container status \"77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9\": rpc error: code = NotFound desc = could not find container \"77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9\": container with ID starting with 77ac10a35f610fabc608180e94689f90937006b57da6d25ff598e5d82b7bb5f9 not found: ID does not exist" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.729532 4786 scope.go:117] "RemoveContainer" containerID="103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0" Mar 13 12:50:21 crc kubenswrapper[4786]: E0313 12:50:21.730282 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0\": container with ID starting with 103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0 not found: ID does not exist" containerID="103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.730304 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0"} err="failed to get container status \"103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0\": rpc error: code = NotFound desc = could not find container 
\"103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0\": container with ID starting with 103cb0c68c772e511b602d372ee6f98c4e657d3fcb82e9d234ac1063b09175c0 not found: ID does not exist" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.750257 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-utilities\") pod \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.750366 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-catalog-content\") pod \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.751120 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-utilities" (OuterVolumeSpecName: "utilities") pod "70d7f4f2-ab94-4589-9d18-5566da8f54d4" (UID: "70d7f4f2-ab94-4589-9d18-5566da8f54d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.756063 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4c4t\" (UniqueName: \"kubernetes.io/projected/70d7f4f2-ab94-4589-9d18-5566da8f54d4-kube-api-access-b4c4t\") pod \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\" (UID: \"70d7f4f2-ab94-4589-9d18-5566da8f54d4\") " Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.756432 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.761436 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d7f4f2-ab94-4589-9d18-5566da8f54d4-kube-api-access-b4c4t" (OuterVolumeSpecName: "kube-api-access-b4c4t") pod "70d7f4f2-ab94-4589-9d18-5566da8f54d4" (UID: "70d7f4f2-ab94-4589-9d18-5566da8f54d4"). InnerVolumeSpecName "kube-api-access-b4c4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.814024 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70d7f4f2-ab94-4589-9d18-5566da8f54d4" (UID: "70d7f4f2-ab94-4589-9d18-5566da8f54d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.857536 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4c4t\" (UniqueName: \"kubernetes.io/projected/70d7f4f2-ab94-4589-9d18-5566da8f54d4-kube-api-access-b4c4t\") on node \"crc\" DevicePath \"\"" Mar 13 12:50:21 crc kubenswrapper[4786]: I0313 12:50:21.857979 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d7f4f2-ab94-4589-9d18-5566da8f54d4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:50:22 crc kubenswrapper[4786]: I0313 12:50:22.006188 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6jfvm"] Mar 13 12:50:22 crc kubenswrapper[4786]: I0313 12:50:22.013366 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6jfvm"] Mar 13 12:50:23 crc kubenswrapper[4786]: I0313 12:50:23.459714 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" path="/var/lib/kubelet/pods/70d7f4f2-ab94-4589-9d18-5566da8f54d4/volumes" Mar 13 12:50:31 crc kubenswrapper[4786]: I0313 12:50:31.441186 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:50:31 crc kubenswrapper[4786]: E0313 12:50:31.442013 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:50:42 crc kubenswrapper[4786]: I0313 12:50:42.441272 4786 scope.go:117] "RemoveContainer" 
containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:50:42 crc kubenswrapper[4786]: E0313 12:50:42.442184 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:50:57 crc kubenswrapper[4786]: I0313 12:50:57.442813 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:50:57 crc kubenswrapper[4786]: E0313 12:50:57.444072 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:51:12 crc kubenswrapper[4786]: I0313 12:51:12.442156 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:51:12 crc kubenswrapper[4786]: E0313 12:51:12.443093 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:51:25 crc kubenswrapper[4786]: I0313 12:51:25.441955 4786 scope.go:117] 
"RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:51:25 crc kubenswrapper[4786]: E0313 12:51:25.443707 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:51:38 crc kubenswrapper[4786]: I0313 12:51:38.440388 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:51:38 crc kubenswrapper[4786]: E0313 12:51:38.440954 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:51:53 crc kubenswrapper[4786]: I0313 12:51:53.444496 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:51:53 crc kubenswrapper[4786]: E0313 12:51:53.445249 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.143975 
4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556772-fsvwz"] Mar 13 12:52:00 crc kubenswrapper[4786]: E0313 12:52:00.144787 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.144806 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4786]: E0313 12:52:00.144817 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerName="extract-content" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.144825 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerName="extract-content" Mar 13 12:52:00 crc kubenswrapper[4786]: E0313 12:52:00.144837 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerName="extract-utilities" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.144846 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerName="extract-utilities" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.145053 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d7f4f2-ab94-4589-9d18-5566da8f54d4" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.145534 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-fsvwz" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.147372 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.147448 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.148255 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.157098 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-fsvwz"] Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.346749 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-887lg\" (UniqueName: \"kubernetes.io/projected/84d1d9e1-ca9a-492a-9146-27655ec1429b-kube-api-access-887lg\") pod \"auto-csr-approver-29556772-fsvwz\" (UID: \"84d1d9e1-ca9a-492a-9146-27655ec1429b\") " pod="openshift-infra/auto-csr-approver-29556772-fsvwz" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.447455 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-887lg\" (UniqueName: \"kubernetes.io/projected/84d1d9e1-ca9a-492a-9146-27655ec1429b-kube-api-access-887lg\") pod \"auto-csr-approver-29556772-fsvwz\" (UID: \"84d1d9e1-ca9a-492a-9146-27655ec1429b\") " pod="openshift-infra/auto-csr-approver-29556772-fsvwz" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.470004 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-887lg\" (UniqueName: \"kubernetes.io/projected/84d1d9e1-ca9a-492a-9146-27655ec1429b-kube-api-access-887lg\") pod \"auto-csr-approver-29556772-fsvwz\" (UID: \"84d1d9e1-ca9a-492a-9146-27655ec1429b\") " 
pod="openshift-infra/auto-csr-approver-29556772-fsvwz" Mar 13 12:52:00 crc kubenswrapper[4786]: I0313 12:52:00.762196 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-fsvwz" Mar 13 12:52:01 crc kubenswrapper[4786]: I0313 12:52:01.157083 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-fsvwz"] Mar 13 12:52:01 crc kubenswrapper[4786]: I0313 12:52:01.471514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-fsvwz" event={"ID":"84d1d9e1-ca9a-492a-9146-27655ec1429b","Type":"ContainerStarted","Data":"e761fa3ea09aaf7acd5870bc472dde5122228fe278b1e37505b7b9b617047aaa"} Mar 13 12:52:03 crc kubenswrapper[4786]: I0313 12:52:03.486671 4786 generic.go:334] "Generic (PLEG): container finished" podID="84d1d9e1-ca9a-492a-9146-27655ec1429b" containerID="6dde950952d366ef971a3dbd7d2ac3f2f5f76ad65f238719752796352f33d8e6" exitCode=0 Mar 13 12:52:03 crc kubenswrapper[4786]: I0313 12:52:03.486727 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-fsvwz" event={"ID":"84d1d9e1-ca9a-492a-9146-27655ec1429b","Type":"ContainerDied","Data":"6dde950952d366ef971a3dbd7d2ac3f2f5f76ad65f238719752796352f33d8e6"} Mar 13 12:52:04 crc kubenswrapper[4786]: I0313 12:52:04.440714 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:52:04 crc kubenswrapper[4786]: E0313 12:52:04.440993 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" 
Mar 13 12:52:04 crc kubenswrapper[4786]: I0313 12:52:04.768131 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-fsvwz" Mar 13 12:52:04 crc kubenswrapper[4786]: I0313 12:52:04.911083 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-887lg\" (UniqueName: \"kubernetes.io/projected/84d1d9e1-ca9a-492a-9146-27655ec1429b-kube-api-access-887lg\") pod \"84d1d9e1-ca9a-492a-9146-27655ec1429b\" (UID: \"84d1d9e1-ca9a-492a-9146-27655ec1429b\") " Mar 13 12:52:04 crc kubenswrapper[4786]: I0313 12:52:04.918718 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d1d9e1-ca9a-492a-9146-27655ec1429b-kube-api-access-887lg" (OuterVolumeSpecName: "kube-api-access-887lg") pod "84d1d9e1-ca9a-492a-9146-27655ec1429b" (UID: "84d1d9e1-ca9a-492a-9146-27655ec1429b"). InnerVolumeSpecName "kube-api-access-887lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:52:05 crc kubenswrapper[4786]: I0313 12:52:05.013789 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-887lg\" (UniqueName: \"kubernetes.io/projected/84d1d9e1-ca9a-492a-9146-27655ec1429b-kube-api-access-887lg\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:05 crc kubenswrapper[4786]: I0313 12:52:05.508277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-fsvwz" event={"ID":"84d1d9e1-ca9a-492a-9146-27655ec1429b","Type":"ContainerDied","Data":"e761fa3ea09aaf7acd5870bc472dde5122228fe278b1e37505b7b9b617047aaa"} Mar 13 12:52:05 crc kubenswrapper[4786]: I0313 12:52:05.508628 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e761fa3ea09aaf7acd5870bc472dde5122228fe278b1e37505b7b9b617047aaa" Mar 13 12:52:05 crc kubenswrapper[4786]: I0313 12:52:05.508373 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-fsvwz" Mar 13 12:52:05 crc kubenswrapper[4786]: I0313 12:52:05.839070 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-nk2j7"] Mar 13 12:52:05 crc kubenswrapper[4786]: I0313 12:52:05.847137 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-nk2j7"] Mar 13 12:52:07 crc kubenswrapper[4786]: I0313 12:52:07.456066 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ded6d40-a34e-4d6a-8a09-39d7ad8c0962" path="/var/lib/kubelet/pods/1ded6d40-a34e-4d6a-8a09-39d7ad8c0962/volumes" Mar 13 12:52:15 crc kubenswrapper[4786]: I0313 12:52:15.441102 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:52:15 crc kubenswrapper[4786]: E0313 12:52:15.441696 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:52:18 crc kubenswrapper[4786]: I0313 12:52:18.370448 4786 scope.go:117] "RemoveContainer" containerID="8f79846e5b12fa73bc66821718ba94063708e54d09211f41499e125309f67d67" Mar 13 12:52:29 crc kubenswrapper[4786]: I0313 12:52:29.441156 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:52:29 crc kubenswrapper[4786]: E0313 12:52:29.441816 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:52:43 crc kubenswrapper[4786]: I0313 12:52:43.448163 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:52:43 crc kubenswrapper[4786]: E0313 12:52:43.449082 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:52:57 crc kubenswrapper[4786]: I0313 12:52:57.441031 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:52:57 crc kubenswrapper[4786]: E0313 12:52:57.441774 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:53:12 crc kubenswrapper[4786]: I0313 12:53:12.440505 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:53:12 crc kubenswrapper[4786]: E0313 12:53:12.441178 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:53:23 crc kubenswrapper[4786]: I0313 12:53:23.444806 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:53:23 crc kubenswrapper[4786]: E0313 12:53:23.445529 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:53:35 crc kubenswrapper[4786]: I0313 12:53:35.440755 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:53:35 crc kubenswrapper[4786]: E0313 12:53:35.441427 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:53:47 crc kubenswrapper[4786]: I0313 12:53:47.446478 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:53:47 crc kubenswrapper[4786]: E0313 12:53:47.447264 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:53:59 crc kubenswrapper[4786]: I0313 12:53:59.441234 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:53:59 crc kubenswrapper[4786]: E0313 12:53:59.441951 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.137174 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556774-lqx4p"] Mar 13 12:54:00 crc kubenswrapper[4786]: E0313 12:54:00.137790 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d9e1-ca9a-492a-9146-27655ec1429b" containerName="oc" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.137891 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d9e1-ca9a-492a-9146-27655ec1429b" containerName="oc" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.138351 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d9e1-ca9a-492a-9146-27655ec1429b" containerName="oc" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.138987 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-lqx4p" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.140744 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.140817 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.141510 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.151288 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-lqx4p"] Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.206014 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpb6x\" (UniqueName: \"kubernetes.io/projected/7a72201b-4152-4903-b5a1-65c8905f77a4-kube-api-access-lpb6x\") pod \"auto-csr-approver-29556774-lqx4p\" (UID: \"7a72201b-4152-4903-b5a1-65c8905f77a4\") " pod="openshift-infra/auto-csr-approver-29556774-lqx4p" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.307921 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpb6x\" (UniqueName: \"kubernetes.io/projected/7a72201b-4152-4903-b5a1-65c8905f77a4-kube-api-access-lpb6x\") pod \"auto-csr-approver-29556774-lqx4p\" (UID: \"7a72201b-4152-4903-b5a1-65c8905f77a4\") " pod="openshift-infra/auto-csr-approver-29556774-lqx4p" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.336653 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpb6x\" (UniqueName: \"kubernetes.io/projected/7a72201b-4152-4903-b5a1-65c8905f77a4-kube-api-access-lpb6x\") pod \"auto-csr-approver-29556774-lqx4p\" (UID: \"7a72201b-4152-4903-b5a1-65c8905f77a4\") " 
pod="openshift-infra/auto-csr-approver-29556774-lqx4p" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.460273 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-lqx4p" Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.901212 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-lqx4p"] Mar 13 12:54:00 crc kubenswrapper[4786]: I0313 12:54:00.915673 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:54:01 crc kubenswrapper[4786]: I0313 12:54:01.640679 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-lqx4p" event={"ID":"7a72201b-4152-4903-b5a1-65c8905f77a4","Type":"ContainerStarted","Data":"74e7bec19b477e0c5aef944a585b5a1ca4fe62d51e0b537e9c2764c44d546698"} Mar 13 12:54:02 crc kubenswrapper[4786]: I0313 12:54:02.648509 4786 generic.go:334] "Generic (PLEG): container finished" podID="7a72201b-4152-4903-b5a1-65c8905f77a4" containerID="a16ed3cb684e5824f6ff5eb7374493379ab860c7cba8b191bc760603b3322ffb" exitCode=0 Mar 13 12:54:02 crc kubenswrapper[4786]: I0313 12:54:02.648686 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-lqx4p" event={"ID":"7a72201b-4152-4903-b5a1-65c8905f77a4","Type":"ContainerDied","Data":"a16ed3cb684e5824f6ff5eb7374493379ab860c7cba8b191bc760603b3322ffb"} Mar 13 12:54:03 crc kubenswrapper[4786]: I0313 12:54:03.928750 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-lqx4p" Mar 13 12:54:04 crc kubenswrapper[4786]: I0313 12:54:04.059290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpb6x\" (UniqueName: \"kubernetes.io/projected/7a72201b-4152-4903-b5a1-65c8905f77a4-kube-api-access-lpb6x\") pod \"7a72201b-4152-4903-b5a1-65c8905f77a4\" (UID: \"7a72201b-4152-4903-b5a1-65c8905f77a4\") " Mar 13 12:54:04 crc kubenswrapper[4786]: I0313 12:54:04.064724 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a72201b-4152-4903-b5a1-65c8905f77a4-kube-api-access-lpb6x" (OuterVolumeSpecName: "kube-api-access-lpb6x") pod "7a72201b-4152-4903-b5a1-65c8905f77a4" (UID: "7a72201b-4152-4903-b5a1-65c8905f77a4"). InnerVolumeSpecName "kube-api-access-lpb6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:54:04 crc kubenswrapper[4786]: I0313 12:54:04.160512 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpb6x\" (UniqueName: \"kubernetes.io/projected/7a72201b-4152-4903-b5a1-65c8905f77a4-kube-api-access-lpb6x\") on node \"crc\" DevicePath \"\"" Mar 13 12:54:04 crc kubenswrapper[4786]: I0313 12:54:04.662324 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-lqx4p" event={"ID":"7a72201b-4152-4903-b5a1-65c8905f77a4","Type":"ContainerDied","Data":"74e7bec19b477e0c5aef944a585b5a1ca4fe62d51e0b537e9c2764c44d546698"} Mar 13 12:54:04 crc kubenswrapper[4786]: I0313 12:54:04.662368 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74e7bec19b477e0c5aef944a585b5a1ca4fe62d51e0b537e9c2764c44d546698" Mar 13 12:54:04 crc kubenswrapper[4786]: I0313 12:54:04.662429 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-lqx4p" Mar 13 12:54:05 crc kubenswrapper[4786]: I0313 12:54:05.000457 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-gzsm7"] Mar 13 12:54:05 crc kubenswrapper[4786]: I0313 12:54:05.005084 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-gzsm7"] Mar 13 12:54:05 crc kubenswrapper[4786]: I0313 12:54:05.451959 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425e116d-47dc-4d62-a7c9-367ab1abc836" path="/var/lib/kubelet/pods/425e116d-47dc-4d62-a7c9-367ab1abc836/volumes" Mar 13 12:54:12 crc kubenswrapper[4786]: I0313 12:54:12.440408 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:54:12 crc kubenswrapper[4786]: E0313 12:54:12.441002 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:54:18 crc kubenswrapper[4786]: I0313 12:54:18.456368 4786 scope.go:117] "RemoveContainer" containerID="ce322772669a3991a6396ef3860ff6e4679c9953e192b1b3f120a359ff278b5c" Mar 13 12:54:26 crc kubenswrapper[4786]: I0313 12:54:26.440943 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:54:26 crc kubenswrapper[4786]: E0313 12:54:26.441626 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:54:37 crc kubenswrapper[4786]: I0313 12:54:37.441468 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:54:37 crc kubenswrapper[4786]: E0313 12:54:37.442326 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 12:54:48 crc kubenswrapper[4786]: I0313 12:54:48.441300 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:54:48 crc kubenswrapper[4786]: I0313 12:54:48.974808 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"06f2398628635d205a73435e995f3165f17e4749a03d0c6f5e7bcd488c6712f1"} Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.280268 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b5knd"] Mar 13 12:55:20 crc kubenswrapper[4786]: E0313 12:55:20.281485 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a72201b-4152-4903-b5a1-65c8905f77a4" containerName="oc" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.281500 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a72201b-4152-4903-b5a1-65c8905f77a4" containerName="oc" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 
12:55:20.281688 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a72201b-4152-4903-b5a1-65c8905f77a4" containerName="oc" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.282994 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.288131 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5knd"] Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.473142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-catalog-content\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.473206 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-utilities\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.473360 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnnnf\" (UniqueName: \"kubernetes.io/projected/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-kube-api-access-nnnnf\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.574345 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-catalog-content\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.574413 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-utilities\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.574474 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnnnf\" (UniqueName: \"kubernetes.io/projected/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-kube-api-access-nnnnf\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.575094 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-catalog-content\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.575137 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-utilities\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.594724 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnnnf\" (UniqueName: 
\"kubernetes.io/projected/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-kube-api-access-nnnnf\") pod \"community-operators-b5knd\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:20 crc kubenswrapper[4786]: I0313 12:55:20.704524 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:21 crc kubenswrapper[4786]: I0313 12:55:21.173478 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5knd"] Mar 13 12:55:21 crc kubenswrapper[4786]: I0313 12:55:21.211707 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5knd" event={"ID":"edcbe8a0-ccb8-4fb2-9a57-118403205c9d","Type":"ContainerStarted","Data":"b266b0a9ed838056d28174d581986559e8e9967eb9cddb8eaa61eba55fd39530"} Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.220446 4786 generic.go:334] "Generic (PLEG): container finished" podID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerID="28e00ee1fcdf01479e3bab0d2781638c1cd6b0e86ed09496823077cba2302e9a" exitCode=0 Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.220563 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5knd" event={"ID":"edcbe8a0-ccb8-4fb2-9a57-118403205c9d","Type":"ContainerDied","Data":"28e00ee1fcdf01479e3bab0d2781638c1cd6b0e86ed09496823077cba2302e9a"} Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.672716 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fswm7"] Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.674623 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.687719 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fswm7"] Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.811661 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-utilities\") pod \"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.811739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-catalog-content\") pod \"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.811778 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b766b\" (UniqueName: \"kubernetes.io/projected/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-kube-api-access-b766b\") pod \"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.913841 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-utilities\") pod \"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.913967 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-catalog-content\") pod \"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.914046 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b766b\" (UniqueName: \"kubernetes.io/projected/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-kube-api-access-b766b\") pod \"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.914835 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-utilities\") pod \"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:22 crc kubenswrapper[4786]: I0313 12:55:22.915253 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-catalog-content\") pod \"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:23 crc kubenswrapper[4786]: I0313 12:55:23.230149 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5knd" event={"ID":"edcbe8a0-ccb8-4fb2-9a57-118403205c9d","Type":"ContainerStarted","Data":"f4b0ab2cc98ccfe21c52a0001102289c485de0caf0052cad18c0467f37e519ce"} Mar 13 12:55:23 crc kubenswrapper[4786]: I0313 12:55:23.240956 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b766b\" (UniqueName: \"kubernetes.io/projected/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-kube-api-access-b766b\") pod 
\"redhat-operators-fswm7\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:23 crc kubenswrapper[4786]: I0313 12:55:23.290177 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:23 crc kubenswrapper[4786]: I0313 12:55:23.670088 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fswm7"] Mar 13 12:55:24 crc kubenswrapper[4786]: I0313 12:55:24.239017 4786 generic.go:334] "Generic (PLEG): container finished" podID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerID="1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b" exitCode=0 Mar 13 12:55:24 crc kubenswrapper[4786]: I0313 12:55:24.239491 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswm7" event={"ID":"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb","Type":"ContainerDied","Data":"1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b"} Mar 13 12:55:24 crc kubenswrapper[4786]: I0313 12:55:24.239525 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswm7" event={"ID":"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb","Type":"ContainerStarted","Data":"bd645cf207a105e75b91689cce7ab3f9e960d4d5791a5838e00cbd2951151b25"} Mar 13 12:55:24 crc kubenswrapper[4786]: I0313 12:55:24.243402 4786 generic.go:334] "Generic (PLEG): container finished" podID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerID="f4b0ab2cc98ccfe21c52a0001102289c485de0caf0052cad18c0467f37e519ce" exitCode=0 Mar 13 12:55:24 crc kubenswrapper[4786]: I0313 12:55:24.243460 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5knd" event={"ID":"edcbe8a0-ccb8-4fb2-9a57-118403205c9d","Type":"ContainerDied","Data":"f4b0ab2cc98ccfe21c52a0001102289c485de0caf0052cad18c0467f37e519ce"} Mar 13 12:55:25 crc 
kubenswrapper[4786]: I0313 12:55:25.253016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5knd" event={"ID":"edcbe8a0-ccb8-4fb2-9a57-118403205c9d","Type":"ContainerStarted","Data":"4219333a717cb2258ca1604ee770cdc4d3d048d1dd85b89b36b939859e6cdf89"} Mar 13 12:55:25 crc kubenswrapper[4786]: I0313 12:55:25.256182 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswm7" event={"ID":"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb","Type":"ContainerStarted","Data":"cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc"} Mar 13 12:55:25 crc kubenswrapper[4786]: I0313 12:55:25.275388 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b5knd" podStartSLOduration=2.582056903 podStartE2EDuration="5.275361563s" podCreationTimestamp="2026-03-13 12:55:20 +0000 UTC" firstStartedPulling="2026-03-13 12:55:22.223835794 +0000 UTC m=+4109.503489231" lastFinishedPulling="2026-03-13 12:55:24.917140444 +0000 UTC m=+4112.196793891" observedRunningTime="2026-03-13 12:55:25.269652217 +0000 UTC m=+4112.549305704" watchObservedRunningTime="2026-03-13 12:55:25.275361563 +0000 UTC m=+4112.555015050" Mar 13 12:55:26 crc kubenswrapper[4786]: I0313 12:55:26.266655 4786 generic.go:334] "Generic (PLEG): container finished" podID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerID="cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc" exitCode=0 Mar 13 12:55:26 crc kubenswrapper[4786]: I0313 12:55:26.266825 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswm7" event={"ID":"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb","Type":"ContainerDied","Data":"cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc"} Mar 13 12:55:27 crc kubenswrapper[4786]: I0313 12:55:27.289757 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-fswm7" event={"ID":"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb","Type":"ContainerStarted","Data":"d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817"} Mar 13 12:55:27 crc kubenswrapper[4786]: I0313 12:55:27.321821 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fswm7" podStartSLOduration=2.859248343 podStartE2EDuration="5.321806786s" podCreationTimestamp="2026-03-13 12:55:22 +0000 UTC" firstStartedPulling="2026-03-13 12:55:24.240948215 +0000 UTC m=+4111.520601662" lastFinishedPulling="2026-03-13 12:55:26.703506658 +0000 UTC m=+4113.983160105" observedRunningTime="2026-03-13 12:55:27.318952679 +0000 UTC m=+4114.598606146" watchObservedRunningTime="2026-03-13 12:55:27.321806786 +0000 UTC m=+4114.601460223" Mar 13 12:55:30 crc kubenswrapper[4786]: I0313 12:55:30.704801 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:30 crc kubenswrapper[4786]: I0313 12:55:30.705183 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:30 crc kubenswrapper[4786]: I0313 12:55:30.766653 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:31 crc kubenswrapper[4786]: I0313 12:55:31.375169 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:33 crc kubenswrapper[4786]: I0313 12:55:33.291414 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:33 crc kubenswrapper[4786]: I0313 12:55:33.291529 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:34 crc 
kubenswrapper[4786]: I0313 12:55:34.362843 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fswm7" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="registry-server" probeResult="failure" output=< Mar 13 12:55:34 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 12:55:34 crc kubenswrapper[4786]: > Mar 13 12:55:34 crc kubenswrapper[4786]: I0313 12:55:34.870974 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5knd"] Mar 13 12:55:34 crc kubenswrapper[4786]: I0313 12:55:34.871467 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b5knd" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerName="registry-server" containerID="cri-o://4219333a717cb2258ca1604ee770cdc4d3d048d1dd85b89b36b939859e6cdf89" gracePeriod=2 Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.359745 4786 generic.go:334] "Generic (PLEG): container finished" podID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerID="4219333a717cb2258ca1604ee770cdc4d3d048d1dd85b89b36b939859e6cdf89" exitCode=0 Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.359822 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5knd" event={"ID":"edcbe8a0-ccb8-4fb2-9a57-118403205c9d","Type":"ContainerDied","Data":"4219333a717cb2258ca1604ee770cdc4d3d048d1dd85b89b36b939859e6cdf89"} Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.685025 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.805814 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-catalog-content\") pod \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.805906 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-utilities\") pod \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.806027 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnnnf\" (UniqueName: \"kubernetes.io/projected/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-kube-api-access-nnnnf\") pod \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\" (UID: \"edcbe8a0-ccb8-4fb2-9a57-118403205c9d\") " Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.807528 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-utilities" (OuterVolumeSpecName: "utilities") pod "edcbe8a0-ccb8-4fb2-9a57-118403205c9d" (UID: "edcbe8a0-ccb8-4fb2-9a57-118403205c9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.812972 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-kube-api-access-nnnnf" (OuterVolumeSpecName: "kube-api-access-nnnnf") pod "edcbe8a0-ccb8-4fb2-9a57-118403205c9d" (UID: "edcbe8a0-ccb8-4fb2-9a57-118403205c9d"). InnerVolumeSpecName "kube-api-access-nnnnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.866281 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edcbe8a0-ccb8-4fb2-9a57-118403205c9d" (UID: "edcbe8a0-ccb8-4fb2-9a57-118403205c9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.907737 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnnnf\" (UniqueName: \"kubernetes.io/projected/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-kube-api-access-nnnnf\") on node \"crc\" DevicePath \"\"" Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.907790 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:55:35 crc kubenswrapper[4786]: I0313 12:55:35.907802 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcbe8a0-ccb8-4fb2-9a57-118403205c9d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:55:36 crc kubenswrapper[4786]: I0313 12:55:36.368431 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5knd" event={"ID":"edcbe8a0-ccb8-4fb2-9a57-118403205c9d","Type":"ContainerDied","Data":"b266b0a9ed838056d28174d581986559e8e9967eb9cddb8eaa61eba55fd39530"} Mar 13 12:55:36 crc kubenswrapper[4786]: I0313 12:55:36.368719 4786 scope.go:117] "RemoveContainer" containerID="4219333a717cb2258ca1604ee770cdc4d3d048d1dd85b89b36b939859e6cdf89" Mar 13 12:55:36 crc kubenswrapper[4786]: I0313 12:55:36.368518 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5knd" Mar 13 12:55:36 crc kubenswrapper[4786]: I0313 12:55:36.400644 4786 scope.go:117] "RemoveContainer" containerID="f4b0ab2cc98ccfe21c52a0001102289c485de0caf0052cad18c0467f37e519ce" Mar 13 12:55:36 crc kubenswrapper[4786]: I0313 12:55:36.403001 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5knd"] Mar 13 12:55:36 crc kubenswrapper[4786]: I0313 12:55:36.407920 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b5knd"] Mar 13 12:55:36 crc kubenswrapper[4786]: I0313 12:55:36.439068 4786 scope.go:117] "RemoveContainer" containerID="28e00ee1fcdf01479e3bab0d2781638c1cd6b0e86ed09496823077cba2302e9a" Mar 13 12:55:37 crc kubenswrapper[4786]: I0313 12:55:37.454008 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" path="/var/lib/kubelet/pods/edcbe8a0-ccb8-4fb2-9a57-118403205c9d/volumes" Mar 13 12:55:43 crc kubenswrapper[4786]: I0313 12:55:43.359251 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:43 crc kubenswrapper[4786]: I0313 12:55:43.401873 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:43 crc kubenswrapper[4786]: I0313 12:55:43.591098 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fswm7"] Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 12:55:44.434766 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fswm7" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="registry-server" containerID="cri-o://d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817" gracePeriod=2 Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 
12:55:44.824293 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 12:55:44.836723 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-utilities\") pod \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 12:55:44.837751 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-utilities" (OuterVolumeSpecName: "utilities") pod "4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" (UID: "4a71db2a-2844-4b02-bba4-1de8f6a7f7eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 12:55:44.837857 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b766b\" (UniqueName: \"kubernetes.io/projected/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-kube-api-access-b766b\") pod \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 12:55:44.838621 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-catalog-content\") pod \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\" (UID: \"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb\") " Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 12:55:44.838809 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 12:55:44.850652 4786 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-kube-api-access-b766b" (OuterVolumeSpecName: "kube-api-access-b766b") pod "4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" (UID: "4a71db2a-2844-4b02-bba4-1de8f6a7f7eb"). InnerVolumeSpecName "kube-api-access-b766b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:55:44 crc kubenswrapper[4786]: I0313 12:55:44.939808 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b766b\" (UniqueName: \"kubernetes.io/projected/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-kube-api-access-b766b\") on node \"crc\" DevicePath \"\"" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.002293 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" (UID: "4a71db2a-2844-4b02-bba4-1de8f6a7f7eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.040618 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.444699 4786 generic.go:334] "Generic (PLEG): container finished" podID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerID="d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817" exitCode=0 Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.444857 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fswm7" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.457972 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswm7" event={"ID":"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb","Type":"ContainerDied","Data":"d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817"} Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.458022 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswm7" event={"ID":"4a71db2a-2844-4b02-bba4-1de8f6a7f7eb","Type":"ContainerDied","Data":"bd645cf207a105e75b91689cce7ab3f9e960d4d5791a5838e00cbd2951151b25"} Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.458041 4786 scope.go:117] "RemoveContainer" containerID="d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.486511 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fswm7"] Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.489633 4786 scope.go:117] "RemoveContainer" containerID="cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.493473 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fswm7"] Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.654100 4786 scope.go:117] "RemoveContainer" containerID="1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.685197 4786 scope.go:117] "RemoveContainer" containerID="d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817" Mar 13 12:55:45 crc kubenswrapper[4786]: E0313 12:55:45.686088 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817\": container with ID starting with d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817 not found: ID does not exist" containerID="d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.686978 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817"} err="failed to get container status \"d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817\": rpc error: code = NotFound desc = could not find container \"d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817\": container with ID starting with d1b9851956df892101939ca7718aaf154c611bb4090a670f492d258c3d9b4817 not found: ID does not exist" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.687015 4786 scope.go:117] "RemoveContainer" containerID="cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc" Mar 13 12:55:45 crc kubenswrapper[4786]: E0313 12:55:45.687437 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc\": container with ID starting with cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc not found: ID does not exist" containerID="cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.687568 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc"} err="failed to get container status \"cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc\": rpc error: code = NotFound desc = could not find container \"cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc\": container with ID 
starting with cc7ef362d8a8234c0bfaa87f1b36ce0cb92bc7c0be270f5ace54078a5951adcc not found: ID does not exist" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.687668 4786 scope.go:117] "RemoveContainer" containerID="1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b" Mar 13 12:55:45 crc kubenswrapper[4786]: E0313 12:55:45.688368 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b\": container with ID starting with 1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b not found: ID does not exist" containerID="1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b" Mar 13 12:55:45 crc kubenswrapper[4786]: I0313 12:55:45.688393 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b"} err="failed to get container status \"1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b\": rpc error: code = NotFound desc = could not find container \"1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b\": container with ID starting with 1f00b254686e8352a01fa666831c91d12179a860dd84bb5b547eb0c98fc1880b not found: ID does not exist" Mar 13 12:55:47 crc kubenswrapper[4786]: I0313 12:55:47.451180 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" path="/var/lib/kubelet/pods/4a71db2a-2844-4b02-bba4-1de8f6a7f7eb/volumes" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.150389 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556776-44gm7"] Mar 13 12:56:00 crc kubenswrapper[4786]: E0313 12:56:00.151450 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerName="extract-utilities" Mar 13 12:56:00 crc 
kubenswrapper[4786]: I0313 12:56:00.151475 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerName="extract-utilities" Mar 13 12:56:00 crc kubenswrapper[4786]: E0313 12:56:00.151499 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerName="registry-server" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.151508 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerName="registry-server" Mar 13 12:56:00 crc kubenswrapper[4786]: E0313 12:56:00.151531 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerName="extract-content" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.151539 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerName="extract-content" Mar 13 12:56:00 crc kubenswrapper[4786]: E0313 12:56:00.151556 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="extract-content" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.151567 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="extract-content" Mar 13 12:56:00 crc kubenswrapper[4786]: E0313 12:56:00.151579 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="registry-server" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.151588 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="registry-server" Mar 13 12:56:00 crc kubenswrapper[4786]: E0313 12:56:00.151604 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="extract-utilities" Mar 13 12:56:00 crc 
kubenswrapper[4786]: I0313 12:56:00.151613 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="extract-utilities" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.151783 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a71db2a-2844-4b02-bba4-1de8f6a7f7eb" containerName="registry-server" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.151818 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="edcbe8a0-ccb8-4fb2-9a57-118403205c9d" containerName="registry-server" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.152648 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-44gm7" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.161413 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.161545 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.161580 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.163071 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556776-44gm7"] Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.351066 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtn87\" (UniqueName: \"kubernetes.io/projected/ed7507e0-d80c-4851-b724-ec37229d2d45-kube-api-access-gtn87\") pod \"auto-csr-approver-29556776-44gm7\" (UID: \"ed7507e0-d80c-4851-b724-ec37229d2d45\") " pod="openshift-infra/auto-csr-approver-29556776-44gm7" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.453538 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtn87\" (UniqueName: \"kubernetes.io/projected/ed7507e0-d80c-4851-b724-ec37229d2d45-kube-api-access-gtn87\") pod \"auto-csr-approver-29556776-44gm7\" (UID: \"ed7507e0-d80c-4851-b724-ec37229d2d45\") " pod="openshift-infra/auto-csr-approver-29556776-44gm7" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.479998 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtn87\" (UniqueName: \"kubernetes.io/projected/ed7507e0-d80c-4851-b724-ec37229d2d45-kube-api-access-gtn87\") pod \"auto-csr-approver-29556776-44gm7\" (UID: \"ed7507e0-d80c-4851-b724-ec37229d2d45\") " pod="openshift-infra/auto-csr-approver-29556776-44gm7" Mar 13 12:56:00 crc kubenswrapper[4786]: I0313 12:56:00.772167 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-44gm7" Mar 13 12:56:01 crc kubenswrapper[4786]: I0313 12:56:01.183913 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556776-44gm7"] Mar 13 12:56:01 crc kubenswrapper[4786]: I0313 12:56:01.568342 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-44gm7" event={"ID":"ed7507e0-d80c-4851-b724-ec37229d2d45","Type":"ContainerStarted","Data":"fbf000953d698b85b4113891215ee808346b12c185455a15106cec88862a1502"} Mar 13 12:56:02 crc kubenswrapper[4786]: I0313 12:56:02.577158 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-44gm7" event={"ID":"ed7507e0-d80c-4851-b724-ec37229d2d45","Type":"ContainerStarted","Data":"47f26da5cda5456180c7c0f0c24391377edf99338244c50359859cf1be775419"} Mar 13 12:56:02 crc kubenswrapper[4786]: I0313 12:56:02.601671 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556776-44gm7" podStartSLOduration=1.484397038 
podStartE2EDuration="2.601649966s" podCreationTimestamp="2026-03-13 12:56:00 +0000 UTC" firstStartedPulling="2026-03-13 12:56:01.194421032 +0000 UTC m=+4148.474074479" lastFinishedPulling="2026-03-13 12:56:02.31167396 +0000 UTC m=+4149.591327407" observedRunningTime="2026-03-13 12:56:02.595019454 +0000 UTC m=+4149.874672891" watchObservedRunningTime="2026-03-13 12:56:02.601649966 +0000 UTC m=+4149.881303413" Mar 13 12:56:03 crc kubenswrapper[4786]: I0313 12:56:03.587748 4786 generic.go:334] "Generic (PLEG): container finished" podID="ed7507e0-d80c-4851-b724-ec37229d2d45" containerID="47f26da5cda5456180c7c0f0c24391377edf99338244c50359859cf1be775419" exitCode=0 Mar 13 12:56:03 crc kubenswrapper[4786]: I0313 12:56:03.587797 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-44gm7" event={"ID":"ed7507e0-d80c-4851-b724-ec37229d2d45","Type":"ContainerDied","Data":"47f26da5cda5456180c7c0f0c24391377edf99338244c50359859cf1be775419"} Mar 13 12:56:04 crc kubenswrapper[4786]: I0313 12:56:04.864222 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-44gm7" Mar 13 12:56:05 crc kubenswrapper[4786]: I0313 12:56:05.022630 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtn87\" (UniqueName: \"kubernetes.io/projected/ed7507e0-d80c-4851-b724-ec37229d2d45-kube-api-access-gtn87\") pod \"ed7507e0-d80c-4851-b724-ec37229d2d45\" (UID: \"ed7507e0-d80c-4851-b724-ec37229d2d45\") " Mar 13 12:56:05 crc kubenswrapper[4786]: I0313 12:56:05.027258 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7507e0-d80c-4851-b724-ec37229d2d45-kube-api-access-gtn87" (OuterVolumeSpecName: "kube-api-access-gtn87") pod "ed7507e0-d80c-4851-b724-ec37229d2d45" (UID: "ed7507e0-d80c-4851-b724-ec37229d2d45"). InnerVolumeSpecName "kube-api-access-gtn87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:56:05 crc kubenswrapper[4786]: I0313 12:56:05.123929 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtn87\" (UniqueName: \"kubernetes.io/projected/ed7507e0-d80c-4851-b724-ec37229d2d45-kube-api-access-gtn87\") on node \"crc\" DevicePath \"\"" Mar 13 12:56:05 crc kubenswrapper[4786]: I0313 12:56:05.601977 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-44gm7" event={"ID":"ed7507e0-d80c-4851-b724-ec37229d2d45","Type":"ContainerDied","Data":"fbf000953d698b85b4113891215ee808346b12c185455a15106cec88862a1502"} Mar 13 12:56:05 crc kubenswrapper[4786]: I0313 12:56:05.602026 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbf000953d698b85b4113891215ee808346b12c185455a15106cec88862a1502" Mar 13 12:56:05 crc kubenswrapper[4786]: I0313 12:56:05.602056 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-44gm7" Mar 13 12:56:05 crc kubenswrapper[4786]: I0313 12:56:05.670382 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-zn8vk"] Mar 13 12:56:05 crc kubenswrapper[4786]: I0313 12:56:05.678811 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-zn8vk"] Mar 13 12:56:07 crc kubenswrapper[4786]: I0313 12:56:07.458388 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850104c6-c39d-4238-a99d-33127a1a23ca" path="/var/lib/kubelet/pods/850104c6-c39d-4238-a99d-33127a1a23ca/volumes" Mar 13 12:56:18 crc kubenswrapper[4786]: I0313 12:56:18.535659 4786 scope.go:117] "RemoveContainer" containerID="0fa9f9952b300d0cf12ddc0f68180a562ac406b26ddbcedc1aa0b6c9356f16f4" Mar 13 12:57:08 crc kubenswrapper[4786]: I0313 12:57:08.169568 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:57:08 crc kubenswrapper[4786]: I0313 12:57:08.170129 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:57:38 crc kubenswrapper[4786]: I0313 12:57:38.169825 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:57:38 crc kubenswrapper[4786]: I0313 12:57:38.170412 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.148246 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pfflz"] Mar 13 12:58:00 crc kubenswrapper[4786]: E0313 12:58:00.149841 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7507e0-d80c-4851-b724-ec37229d2d45" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.149859 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7507e0-d80c-4851-b724-ec37229d2d45" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.150069 4786 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ed7507e0-d80c-4851-b724-ec37229d2d45" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.150739 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pfflz" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.154918 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.155093 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.155142 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.159352 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pfflz"] Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.266063 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmrrk\" (UniqueName: \"kubernetes.io/projected/6f06c307-3017-4181-b34d-60194499d5cf-kube-api-access-rmrrk\") pod \"auto-csr-approver-29556778-pfflz\" (UID: \"6f06c307-3017-4181-b34d-60194499d5cf\") " pod="openshift-infra/auto-csr-approver-29556778-pfflz" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.367251 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmrrk\" (UniqueName: \"kubernetes.io/projected/6f06c307-3017-4181-b34d-60194499d5cf-kube-api-access-rmrrk\") pod \"auto-csr-approver-29556778-pfflz\" (UID: \"6f06c307-3017-4181-b34d-60194499d5cf\") " pod="openshift-infra/auto-csr-approver-29556778-pfflz" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.388806 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rmrrk\" (UniqueName: \"kubernetes.io/projected/6f06c307-3017-4181-b34d-60194499d5cf-kube-api-access-rmrrk\") pod \"auto-csr-approver-29556778-pfflz\" (UID: \"6f06c307-3017-4181-b34d-60194499d5cf\") " pod="openshift-infra/auto-csr-approver-29556778-pfflz" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.478272 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pfflz" Mar 13 12:58:00 crc kubenswrapper[4786]: I0313 12:58:00.904658 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pfflz"] Mar 13 12:58:00 crc kubenswrapper[4786]: W0313 12:58:00.918191 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f06c307_3017_4181_b34d_60194499d5cf.slice/crio-b310b562b53b8b07a632ecf1ae877c16229cf7c1ceb7bd0452b99e6f2d9ade70 WatchSource:0}: Error finding container b310b562b53b8b07a632ecf1ae877c16229cf7c1ceb7bd0452b99e6f2d9ade70: Status 404 returned error can't find the container with id b310b562b53b8b07a632ecf1ae877c16229cf7c1ceb7bd0452b99e6f2d9ade70 Mar 13 12:58:01 crc kubenswrapper[4786]: I0313 12:58:01.434261 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pfflz" event={"ID":"6f06c307-3017-4181-b34d-60194499d5cf","Type":"ContainerStarted","Data":"b310b562b53b8b07a632ecf1ae877c16229cf7c1ceb7bd0452b99e6f2d9ade70"} Mar 13 12:58:02 crc kubenswrapper[4786]: I0313 12:58:02.440974 4786 generic.go:334] "Generic (PLEG): container finished" podID="6f06c307-3017-4181-b34d-60194499d5cf" containerID="d658e948756e7764b17d6dc4547ec84f4cfda2c110a686c2776ad9a2c59b0f75" exitCode=0 Mar 13 12:58:02 crc kubenswrapper[4786]: I0313 12:58:02.441228 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pfflz" 
event={"ID":"6f06c307-3017-4181-b34d-60194499d5cf","Type":"ContainerDied","Data":"d658e948756e7764b17d6dc4547ec84f4cfda2c110a686c2776ad9a2c59b0f75"} Mar 13 12:58:03 crc kubenswrapper[4786]: I0313 12:58:03.697457 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pfflz" Mar 13 12:58:03 crc kubenswrapper[4786]: I0313 12:58:03.817848 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmrrk\" (UniqueName: \"kubernetes.io/projected/6f06c307-3017-4181-b34d-60194499d5cf-kube-api-access-rmrrk\") pod \"6f06c307-3017-4181-b34d-60194499d5cf\" (UID: \"6f06c307-3017-4181-b34d-60194499d5cf\") " Mar 13 12:58:03 crc kubenswrapper[4786]: I0313 12:58:03.823661 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f06c307-3017-4181-b34d-60194499d5cf-kube-api-access-rmrrk" (OuterVolumeSpecName: "kube-api-access-rmrrk") pod "6f06c307-3017-4181-b34d-60194499d5cf" (UID: "6f06c307-3017-4181-b34d-60194499d5cf"). InnerVolumeSpecName "kube-api-access-rmrrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:58:03 crc kubenswrapper[4786]: I0313 12:58:03.919477 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmrrk\" (UniqueName: \"kubernetes.io/projected/6f06c307-3017-4181-b34d-60194499d5cf-kube-api-access-rmrrk\") on node \"crc\" DevicePath \"\"" Mar 13 12:58:04 crc kubenswrapper[4786]: I0313 12:58:04.456743 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pfflz" event={"ID":"6f06c307-3017-4181-b34d-60194499d5cf","Type":"ContainerDied","Data":"b310b562b53b8b07a632ecf1ae877c16229cf7c1ceb7bd0452b99e6f2d9ade70"} Mar 13 12:58:04 crc kubenswrapper[4786]: I0313 12:58:04.456787 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b310b562b53b8b07a632ecf1ae877c16229cf7c1ceb7bd0452b99e6f2d9ade70" Mar 13 12:58:04 crc kubenswrapper[4786]: I0313 12:58:04.456792 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pfflz" Mar 13 12:58:04 crc kubenswrapper[4786]: I0313 12:58:04.761996 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-fsvwz"] Mar 13 12:58:04 crc kubenswrapper[4786]: I0313 12:58:04.766486 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-fsvwz"] Mar 13 12:58:05 crc kubenswrapper[4786]: I0313 12:58:05.456354 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d1d9e1-ca9a-492a-9146-27655ec1429b" path="/var/lib/kubelet/pods/84d1d9e1-ca9a-492a-9146-27655ec1429b/volumes" Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.169082 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.169675 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.169738 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.170478 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06f2398628635d205a73435e995f3165f17e4749a03d0c6f5e7bcd488c6712f1"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.170546 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://06f2398628635d205a73435e995f3165f17e4749a03d0c6f5e7bcd488c6712f1" gracePeriod=600 Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.488377 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="06f2398628635d205a73435e995f3165f17e4749a03d0c6f5e7bcd488c6712f1" exitCode=0 Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.488699 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" 
event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"06f2398628635d205a73435e995f3165f17e4749a03d0c6f5e7bcd488c6712f1"} Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.488846 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab"} Mar 13 12:58:08 crc kubenswrapper[4786]: I0313 12:58:08.488913 4786 scope.go:117] "RemoveContainer" containerID="22a1ce61b6b87f00bc1079ed80ee398804b172ff02c112f25fe7843c0f2d7219" Mar 13 12:58:18 crc kubenswrapper[4786]: I0313 12:58:18.640569 4786 scope.go:117] "RemoveContainer" containerID="6dde950952d366ef971a3dbd7d2ac3f2f5f76ad65f238719752796352f33d8e6" Mar 13 12:59:44 crc kubenswrapper[4786]: I0313 12:59:44.862134 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wspkk"] Mar 13 12:59:44 crc kubenswrapper[4786]: E0313 12:59:44.863068 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f06c307-3017-4181-b34d-60194499d5cf" containerName="oc" Mar 13 12:59:44 crc kubenswrapper[4786]: I0313 12:59:44.863160 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f06c307-3017-4181-b34d-60194499d5cf" containerName="oc" Mar 13 12:59:44 crc kubenswrapper[4786]: I0313 12:59:44.863318 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f06c307-3017-4181-b34d-60194499d5cf" containerName="oc" Mar 13 12:59:44 crc kubenswrapper[4786]: I0313 12:59:44.864392 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:44 crc kubenswrapper[4786]: I0313 12:59:44.880221 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wspkk"] Mar 13 12:59:44 crc kubenswrapper[4786]: I0313 12:59:44.916998 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-utilities\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:44 crc kubenswrapper[4786]: I0313 12:59:44.917041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-catalog-content\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:44 crc kubenswrapper[4786]: I0313 12:59:44.917109 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qcdd\" (UniqueName: \"kubernetes.io/projected/ec1550cf-afa8-409b-b48f-e89cca640798-kube-api-access-9qcdd\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:45 crc kubenswrapper[4786]: I0313 12:59:45.018272 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-utilities\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:45 crc kubenswrapper[4786]: I0313 12:59:45.018346 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-catalog-content\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:45 crc kubenswrapper[4786]: I0313 12:59:45.018954 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-utilities\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:45 crc kubenswrapper[4786]: I0313 12:59:45.018984 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-catalog-content\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:45 crc kubenswrapper[4786]: I0313 12:59:45.019057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qcdd\" (UniqueName: \"kubernetes.io/projected/ec1550cf-afa8-409b-b48f-e89cca640798-kube-api-access-9qcdd\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:45 crc kubenswrapper[4786]: I0313 12:59:45.041236 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qcdd\" (UniqueName: \"kubernetes.io/projected/ec1550cf-afa8-409b-b48f-e89cca640798-kube-api-access-9qcdd\") pod \"redhat-marketplace-wspkk\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:45 crc kubenswrapper[4786]: I0313 12:59:45.193679 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:45 crc kubenswrapper[4786]: I0313 12:59:45.662106 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wspkk"] Mar 13 12:59:45 crc kubenswrapper[4786]: W0313 12:59:45.666935 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec1550cf_afa8_409b_b48f_e89cca640798.slice/crio-b07f8d0e788f4aea456f1542ec660c70eb675ef9b5de03d4d2fe66b8267d21ce WatchSource:0}: Error finding container b07f8d0e788f4aea456f1542ec660c70eb675ef9b5de03d4d2fe66b8267d21ce: Status 404 returned error can't find the container with id b07f8d0e788f4aea456f1542ec660c70eb675ef9b5de03d4d2fe66b8267d21ce Mar 13 12:59:46 crc kubenswrapper[4786]: I0313 12:59:46.216203 4786 generic.go:334] "Generic (PLEG): container finished" podID="ec1550cf-afa8-409b-b48f-e89cca640798" containerID="9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384" exitCode=0 Mar 13 12:59:46 crc kubenswrapper[4786]: I0313 12:59:46.216247 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wspkk" event={"ID":"ec1550cf-afa8-409b-b48f-e89cca640798","Type":"ContainerDied","Data":"9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384"} Mar 13 12:59:46 crc kubenswrapper[4786]: I0313 12:59:46.216272 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wspkk" event={"ID":"ec1550cf-afa8-409b-b48f-e89cca640798","Type":"ContainerStarted","Data":"b07f8d0e788f4aea456f1542ec660c70eb675ef9b5de03d4d2fe66b8267d21ce"} Mar 13 12:59:46 crc kubenswrapper[4786]: I0313 12:59:46.218903 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:59:47 crc kubenswrapper[4786]: I0313 12:59:47.224016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-wspkk" event={"ID":"ec1550cf-afa8-409b-b48f-e89cca640798","Type":"ContainerStarted","Data":"64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337"} Mar 13 12:59:48 crc kubenswrapper[4786]: I0313 12:59:48.237731 4786 generic.go:334] "Generic (PLEG): container finished" podID="ec1550cf-afa8-409b-b48f-e89cca640798" containerID="64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337" exitCode=0 Mar 13 12:59:48 crc kubenswrapper[4786]: I0313 12:59:48.237823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wspkk" event={"ID":"ec1550cf-afa8-409b-b48f-e89cca640798","Type":"ContainerDied","Data":"64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337"} Mar 13 12:59:49 crc kubenswrapper[4786]: I0313 12:59:49.248152 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wspkk" event={"ID":"ec1550cf-afa8-409b-b48f-e89cca640798","Type":"ContainerStarted","Data":"8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52"} Mar 13 12:59:49 crc kubenswrapper[4786]: I0313 12:59:49.274724 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wspkk" podStartSLOduration=2.842861033 podStartE2EDuration="5.274686653s" podCreationTimestamp="2026-03-13 12:59:44 +0000 UTC" firstStartedPulling="2026-03-13 12:59:46.218603651 +0000 UTC m=+4373.498257098" lastFinishedPulling="2026-03-13 12:59:48.650429271 +0000 UTC m=+4375.930082718" observedRunningTime="2026-03-13 12:59:49.271340651 +0000 UTC m=+4376.550994108" watchObservedRunningTime="2026-03-13 12:59:49.274686653 +0000 UTC m=+4376.554340160" Mar 13 12:59:55 crc kubenswrapper[4786]: I0313 12:59:55.195152 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:55 crc kubenswrapper[4786]: I0313 12:59:55.195686 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:55 crc kubenswrapper[4786]: I0313 12:59:55.256040 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:55 crc kubenswrapper[4786]: I0313 12:59:55.348263 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:55 crc kubenswrapper[4786]: I0313 12:59:55.497234 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wspkk"] Mar 13 12:59:57 crc kubenswrapper[4786]: I0313 12:59:57.325446 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wspkk" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" containerName="registry-server" containerID="cri-o://8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52" gracePeriod=2 Mar 13 12:59:57 crc kubenswrapper[4786]: I0313 12:59:57.711055 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:57 crc kubenswrapper[4786]: I0313 12:59:57.911037 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-utilities\") pod \"ec1550cf-afa8-409b-b48f-e89cca640798\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " Mar 13 12:59:57 crc kubenswrapper[4786]: I0313 12:59:57.911086 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qcdd\" (UniqueName: \"kubernetes.io/projected/ec1550cf-afa8-409b-b48f-e89cca640798-kube-api-access-9qcdd\") pod \"ec1550cf-afa8-409b-b48f-e89cca640798\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " Mar 13 12:59:57 crc kubenswrapper[4786]: I0313 12:59:57.911155 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-catalog-content\") pod \"ec1550cf-afa8-409b-b48f-e89cca640798\" (UID: \"ec1550cf-afa8-409b-b48f-e89cca640798\") " Mar 13 12:59:57 crc kubenswrapper[4786]: I0313 12:59:57.911918 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-utilities" (OuterVolumeSpecName: "utilities") pod "ec1550cf-afa8-409b-b48f-e89cca640798" (UID: "ec1550cf-afa8-409b-b48f-e89cca640798"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:59:57 crc kubenswrapper[4786]: I0313 12:59:57.916722 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1550cf-afa8-409b-b48f-e89cca640798-kube-api-access-9qcdd" (OuterVolumeSpecName: "kube-api-access-9qcdd") pod "ec1550cf-afa8-409b-b48f-e89cca640798" (UID: "ec1550cf-afa8-409b-b48f-e89cca640798"). InnerVolumeSpecName "kube-api-access-9qcdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:59:57 crc kubenswrapper[4786]: I0313 12:59:57.947341 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec1550cf-afa8-409b-b48f-e89cca640798" (UID: "ec1550cf-afa8-409b-b48f-e89cca640798"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.012382 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.012420 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1550cf-afa8-409b-b48f-e89cca640798-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.012430 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qcdd\" (UniqueName: \"kubernetes.io/projected/ec1550cf-afa8-409b-b48f-e89cca640798-kube-api-access-9qcdd\") on node \"crc\" DevicePath \"\"" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.333931 4786 generic.go:334] "Generic (PLEG): container finished" podID="ec1550cf-afa8-409b-b48f-e89cca640798" containerID="8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52" exitCode=0 Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.334005 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wspkk" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.334021 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wspkk" event={"ID":"ec1550cf-afa8-409b-b48f-e89cca640798","Type":"ContainerDied","Data":"8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52"} Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.334161 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wspkk" event={"ID":"ec1550cf-afa8-409b-b48f-e89cca640798","Type":"ContainerDied","Data":"b07f8d0e788f4aea456f1542ec660c70eb675ef9b5de03d4d2fe66b8267d21ce"} Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.334245 4786 scope.go:117] "RemoveContainer" containerID="8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.358483 4786 scope.go:117] "RemoveContainer" containerID="64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.371612 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wspkk"] Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.377926 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wspkk"] Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.378333 4786 scope.go:117] "RemoveContainer" containerID="9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.399428 4786 scope.go:117] "RemoveContainer" containerID="8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52" Mar 13 12:59:58 crc kubenswrapper[4786]: E0313 12:59:58.399778 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52\": container with ID starting with 8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52 not found: ID does not exist" containerID="8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.399817 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52"} err="failed to get container status \"8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52\": rpc error: code = NotFound desc = could not find container \"8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52\": container with ID starting with 8338aece0fff1b832bc859aec9b3df9fb504df64d5bd551131e42c6edcc9db52 not found: ID does not exist" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.399842 4786 scope.go:117] "RemoveContainer" containerID="64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337" Mar 13 12:59:58 crc kubenswrapper[4786]: E0313 12:59:58.400108 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337\": container with ID starting with 64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337 not found: ID does not exist" containerID="64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.400145 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337"} err="failed to get container status \"64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337\": rpc error: code = NotFound desc = could not find container \"64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337\": container with ID 
starting with 64ba63b2327f090d6b23739c1f493f2bed87392352d169f96d67cab6291ff337 not found: ID does not exist" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.400165 4786 scope.go:117] "RemoveContainer" containerID="9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384" Mar 13 12:59:58 crc kubenswrapper[4786]: E0313 12:59:58.400412 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384\": container with ID starting with 9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384 not found: ID does not exist" containerID="9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384" Mar 13 12:59:58 crc kubenswrapper[4786]: I0313 12:59:58.400451 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384"} err="failed to get container status \"9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384\": rpc error: code = NotFound desc = could not find container \"9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384\": container with ID starting with 9caf2292f1f6a28582ec0fdbeba4ed83e50f4960269abb4e2f571a7f7b72d384 not found: ID does not exist" Mar 13 12:59:59 crc kubenswrapper[4786]: I0313 12:59:59.451707 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" path="/var/lib/kubelet/pods/ec1550cf-afa8-409b-b48f-e89cca640798/volumes" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.154515 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t"] Mar 13 13:00:00 crc kubenswrapper[4786]: E0313 13:00:00.154993 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" containerName="extract-utilities" Mar 
13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.155010 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" containerName="extract-utilities" Mar 13 13:00:00 crc kubenswrapper[4786]: E0313 13:00:00.155021 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" containerName="extract-content" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.155030 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" containerName="extract-content" Mar 13 13:00:00 crc kubenswrapper[4786]: E0313 13:00:00.155050 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" containerName="registry-server" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.155060 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" containerName="registry-server" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.155209 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1550cf-afa8-409b-b48f-e89cca640798" containerName="registry-server" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.155778 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.157722 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.158344 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.166631 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556780-mvm68"] Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.173226 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t"] Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.173587 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-mvm68" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.179749 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.180100 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.184240 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556780-mvm68"] Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.193293 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.346512 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jspgv\" (UniqueName: 
\"kubernetes.io/projected/4fccf9fb-7515-4672-8286-33b95ee89998-kube-api-access-jspgv\") pod \"auto-csr-approver-29556780-mvm68\" (UID: \"4fccf9fb-7515-4672-8286-33b95ee89998\") " pod="openshift-infra/auto-csr-approver-29556780-mvm68" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.346582 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52e1982-8b39-4a21-96d0-aaa8c90d8795-secret-volume\") pod \"collect-profiles-29556780-4vl8t\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.346611 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5pkw\" (UniqueName: \"kubernetes.io/projected/f52e1982-8b39-4a21-96d0-aaa8c90d8795-kube-api-access-d5pkw\") pod \"collect-profiles-29556780-4vl8t\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.346639 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52e1982-8b39-4a21-96d0-aaa8c90d8795-config-volume\") pod \"collect-profiles-29556780-4vl8t\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.448673 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jspgv\" (UniqueName: \"kubernetes.io/projected/4fccf9fb-7515-4672-8286-33b95ee89998-kube-api-access-jspgv\") pod \"auto-csr-approver-29556780-mvm68\" (UID: \"4fccf9fb-7515-4672-8286-33b95ee89998\") " pod="openshift-infra/auto-csr-approver-29556780-mvm68" Mar 13 13:00:00 
crc kubenswrapper[4786]: I0313 13:00:00.448796 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52e1982-8b39-4a21-96d0-aaa8c90d8795-secret-volume\") pod \"collect-profiles-29556780-4vl8t\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.448843 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5pkw\" (UniqueName: \"kubernetes.io/projected/f52e1982-8b39-4a21-96d0-aaa8c90d8795-kube-api-access-d5pkw\") pod \"collect-profiles-29556780-4vl8t\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.448872 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52e1982-8b39-4a21-96d0-aaa8c90d8795-config-volume\") pod \"collect-profiles-29556780-4vl8t\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.450405 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52e1982-8b39-4a21-96d0-aaa8c90d8795-config-volume\") pod \"collect-profiles-29556780-4vl8t\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.456957 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52e1982-8b39-4a21-96d0-aaa8c90d8795-secret-volume\") pod \"collect-profiles-29556780-4vl8t\" (UID: 
\"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.465639 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jspgv\" (UniqueName: \"kubernetes.io/projected/4fccf9fb-7515-4672-8286-33b95ee89998-kube-api-access-jspgv\") pod \"auto-csr-approver-29556780-mvm68\" (UID: \"4fccf9fb-7515-4672-8286-33b95ee89998\") " pod="openshift-infra/auto-csr-approver-29556780-mvm68" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.472769 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5pkw\" (UniqueName: \"kubernetes.io/projected/f52e1982-8b39-4a21-96d0-aaa8c90d8795-kube-api-access-d5pkw\") pod \"collect-profiles-29556780-4vl8t\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.478044 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.500487 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-mvm68" Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.915130 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t"] Mar 13 13:00:00 crc kubenswrapper[4786]: I0313 13:00:00.974689 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556780-mvm68"] Mar 13 13:00:00 crc kubenswrapper[4786]: W0313 13:00:00.984989 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fccf9fb_7515_4672_8286_33b95ee89998.slice/crio-60e9c3696cfe88bcb70d43d12b5fb6adbfdb84a42edb26a96bab7eab217c7a18 WatchSource:0}: Error finding container 60e9c3696cfe88bcb70d43d12b5fb6adbfdb84a42edb26a96bab7eab217c7a18: Status 404 returned error can't find the container with id 60e9c3696cfe88bcb70d43d12b5fb6adbfdb84a42edb26a96bab7eab217c7a18 Mar 13 13:00:01 crc kubenswrapper[4786]: I0313 13:00:01.362384 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-mvm68" event={"ID":"4fccf9fb-7515-4672-8286-33b95ee89998","Type":"ContainerStarted","Data":"60e9c3696cfe88bcb70d43d12b5fb6adbfdb84a42edb26a96bab7eab217c7a18"} Mar 13 13:00:01 crc kubenswrapper[4786]: I0313 13:00:01.364296 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" event={"ID":"f52e1982-8b39-4a21-96d0-aaa8c90d8795","Type":"ContainerStarted","Data":"efd7684127293b210e9e2875d7377a1f16f0b3bfac5972b392aebc4572d7432f"} Mar 13 13:00:01 crc kubenswrapper[4786]: I0313 13:00:01.364337 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" 
event={"ID":"f52e1982-8b39-4a21-96d0-aaa8c90d8795","Type":"ContainerStarted","Data":"8cef6e323841bddae7467de016156ec6d7a7207f77261e0969ba4f7d2a51db98"} Mar 13 13:00:01 crc kubenswrapper[4786]: I0313 13:00:01.381256 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" podStartSLOduration=1.38123515 podStartE2EDuration="1.38123515s" podCreationTimestamp="2026-03-13 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:00:01.3772019 +0000 UTC m=+4388.656855377" watchObservedRunningTime="2026-03-13 13:00:01.38123515 +0000 UTC m=+4388.660888597" Mar 13 13:00:02 crc kubenswrapper[4786]: I0313 13:00:02.371983 4786 generic.go:334] "Generic (PLEG): container finished" podID="f52e1982-8b39-4a21-96d0-aaa8c90d8795" containerID="efd7684127293b210e9e2875d7377a1f16f0b3bfac5972b392aebc4572d7432f" exitCode=0 Mar 13 13:00:02 crc kubenswrapper[4786]: I0313 13:00:02.372059 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" event={"ID":"f52e1982-8b39-4a21-96d0-aaa8c90d8795","Type":"ContainerDied","Data":"efd7684127293b210e9e2875d7377a1f16f0b3bfac5972b392aebc4572d7432f"} Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.635126 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.794760 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5pkw\" (UniqueName: \"kubernetes.io/projected/f52e1982-8b39-4a21-96d0-aaa8c90d8795-kube-api-access-d5pkw\") pod \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.794906 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52e1982-8b39-4a21-96d0-aaa8c90d8795-secret-volume\") pod \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.794945 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52e1982-8b39-4a21-96d0-aaa8c90d8795-config-volume\") pod \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\" (UID: \"f52e1982-8b39-4a21-96d0-aaa8c90d8795\") " Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.795719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f52e1982-8b39-4a21-96d0-aaa8c90d8795-config-volume" (OuterVolumeSpecName: "config-volume") pod "f52e1982-8b39-4a21-96d0-aaa8c90d8795" (UID: "f52e1982-8b39-4a21-96d0-aaa8c90d8795"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.800427 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52e1982-8b39-4a21-96d0-aaa8c90d8795-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f52e1982-8b39-4a21-96d0-aaa8c90d8795" (UID: "f52e1982-8b39-4a21-96d0-aaa8c90d8795"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.801709 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52e1982-8b39-4a21-96d0-aaa8c90d8795-kube-api-access-d5pkw" (OuterVolumeSpecName: "kube-api-access-d5pkw") pod "f52e1982-8b39-4a21-96d0-aaa8c90d8795" (UID: "f52e1982-8b39-4a21-96d0-aaa8c90d8795"). InnerVolumeSpecName "kube-api-access-d5pkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.897063 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5pkw\" (UniqueName: \"kubernetes.io/projected/f52e1982-8b39-4a21-96d0-aaa8c90d8795-kube-api-access-d5pkw\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.897371 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52e1982-8b39-4a21-96d0-aaa8c90d8795-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:03 crc kubenswrapper[4786]: I0313 13:00:03.897385 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52e1982-8b39-4a21-96d0-aaa8c90d8795-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:04 crc kubenswrapper[4786]: I0313 13:00:04.388863 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" Mar 13 13:00:04 crc kubenswrapper[4786]: I0313 13:00:04.389282 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-4vl8t" event={"ID":"f52e1982-8b39-4a21-96d0-aaa8c90d8795","Type":"ContainerDied","Data":"8cef6e323841bddae7467de016156ec6d7a7207f77261e0969ba4f7d2a51db98"} Mar 13 13:00:04 crc kubenswrapper[4786]: I0313 13:00:04.389542 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cef6e323841bddae7467de016156ec6d7a7207f77261e0969ba4f7d2a51db98" Mar 13 13:00:04 crc kubenswrapper[4786]: I0313 13:00:04.392041 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-mvm68" event={"ID":"4fccf9fb-7515-4672-8286-33b95ee89998","Type":"ContainerStarted","Data":"9f42ed417edab316bc9fcd7dafe5dd394e8eb7949e359c1b9af92022c8284a88"} Mar 13 13:00:04 crc kubenswrapper[4786]: I0313 13:00:04.410431 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556780-mvm68" podStartSLOduration=1.390781051 podStartE2EDuration="4.410405449s" podCreationTimestamp="2026-03-13 13:00:00 +0000 UTC" firstStartedPulling="2026-03-13 13:00:00.987955159 +0000 UTC m=+4388.267608606" lastFinishedPulling="2026-03-13 13:00:04.007579557 +0000 UTC m=+4391.287233004" observedRunningTime="2026-03-13 13:00:04.404850527 +0000 UTC m=+4391.684503984" watchObservedRunningTime="2026-03-13 13:00:04.410405449 +0000 UTC m=+4391.690058916" Mar 13 13:00:04 crc kubenswrapper[4786]: I0313 13:00:04.470451 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf"] Mar 13 13:00:04 crc kubenswrapper[4786]: I0313 13:00:04.475309 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-r9fcf"] Mar 13 13:00:05 crc kubenswrapper[4786]: I0313 13:00:05.400613 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fccf9fb-7515-4672-8286-33b95ee89998" containerID="9f42ed417edab316bc9fcd7dafe5dd394e8eb7949e359c1b9af92022c8284a88" exitCode=0 Mar 13 13:00:05 crc kubenswrapper[4786]: I0313 13:00:05.400669 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-mvm68" event={"ID":"4fccf9fb-7515-4672-8286-33b95ee89998","Type":"ContainerDied","Data":"9f42ed417edab316bc9fcd7dafe5dd394e8eb7949e359c1b9af92022c8284a88"} Mar 13 13:00:05 crc kubenswrapper[4786]: I0313 13:00:05.448866 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b" path="/var/lib/kubelet/pods/849c5e5c-f1b3-43ba-ab9d-30a02ad2d10b/volumes" Mar 13 13:00:06 crc kubenswrapper[4786]: I0313 13:00:06.658109 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-mvm68" Mar 13 13:00:06 crc kubenswrapper[4786]: I0313 13:00:06.668449 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jspgv\" (UniqueName: \"kubernetes.io/projected/4fccf9fb-7515-4672-8286-33b95ee89998-kube-api-access-jspgv\") pod \"4fccf9fb-7515-4672-8286-33b95ee89998\" (UID: \"4fccf9fb-7515-4672-8286-33b95ee89998\") " Mar 13 13:00:06 crc kubenswrapper[4786]: I0313 13:00:06.707695 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fccf9fb-7515-4672-8286-33b95ee89998-kube-api-access-jspgv" (OuterVolumeSpecName: "kube-api-access-jspgv") pod "4fccf9fb-7515-4672-8286-33b95ee89998" (UID: "4fccf9fb-7515-4672-8286-33b95ee89998"). InnerVolumeSpecName "kube-api-access-jspgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:00:06 crc kubenswrapper[4786]: I0313 13:00:06.769760 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jspgv\" (UniqueName: \"kubernetes.io/projected/4fccf9fb-7515-4672-8286-33b95ee89998-kube-api-access-jspgv\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:07 crc kubenswrapper[4786]: I0313 13:00:07.413490 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-mvm68" event={"ID":"4fccf9fb-7515-4672-8286-33b95ee89998","Type":"ContainerDied","Data":"60e9c3696cfe88bcb70d43d12b5fb6adbfdb84a42edb26a96bab7eab217c7a18"} Mar 13 13:00:07 crc kubenswrapper[4786]: I0313 13:00:07.413546 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60e9c3696cfe88bcb70d43d12b5fb6adbfdb84a42edb26a96bab7eab217c7a18" Mar 13 13:00:07 crc kubenswrapper[4786]: I0313 13:00:07.413620 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-mvm68" Mar 13 13:00:07 crc kubenswrapper[4786]: I0313 13:00:07.714304 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-lqx4p"] Mar 13 13:00:07 crc kubenswrapper[4786]: I0313 13:00:07.719008 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-lqx4p"] Mar 13 13:00:08 crc kubenswrapper[4786]: I0313 13:00:08.168784 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:00:08 crc kubenswrapper[4786]: I0313 13:00:08.168853 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" 
podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:00:09 crc kubenswrapper[4786]: I0313 13:00:09.455207 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a72201b-4152-4903-b5a1-65c8905f77a4" path="/var/lib/kubelet/pods/7a72201b-4152-4903-b5a1-65c8905f77a4/volumes" Mar 13 13:00:18 crc kubenswrapper[4786]: I0313 13:00:18.723360 4786 scope.go:117] "RemoveContainer" containerID="a16ed3cb684e5824f6ff5eb7374493379ab860c7cba8b191bc760603b3322ffb" Mar 13 13:00:18 crc kubenswrapper[4786]: I0313 13:00:18.779538 4786 scope.go:117] "RemoveContainer" containerID="0cb12496b8a6706cffc3915bd459e1f457e1f4388d632e87469bbb5ef8a2b453" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.221666 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l2kj7"] Mar 13 13:00:31 crc kubenswrapper[4786]: E0313 13:00:31.223148 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fccf9fb-7515-4672-8286-33b95ee89998" containerName="oc" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.223195 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fccf9fb-7515-4672-8286-33b95ee89998" containerName="oc" Mar 13 13:00:31 crc kubenswrapper[4786]: E0313 13:00:31.223268 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52e1982-8b39-4a21-96d0-aaa8c90d8795" containerName="collect-profiles" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.223287 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52e1982-8b39-4a21-96d0-aaa8c90d8795" containerName="collect-profiles" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.223671 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fccf9fb-7515-4672-8286-33b95ee89998" containerName="oc" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 
13:00:31.223723 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52e1982-8b39-4a21-96d0-aaa8c90d8795" containerName="collect-profiles" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.226176 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.229866 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l2kj7"] Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.351900 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-utilities\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.352446 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhnhl\" (UniqueName: \"kubernetes.io/projected/e4cc3a52-da49-43de-afbf-6de89927d28d-kube-api-access-zhnhl\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.352502 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-catalog-content\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.454160 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhnhl\" (UniqueName: 
\"kubernetes.io/projected/e4cc3a52-da49-43de-afbf-6de89927d28d-kube-api-access-zhnhl\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.454206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-catalog-content\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.454279 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-utilities\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.454831 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-catalog-content\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.454852 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-utilities\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.474572 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhnhl\" (UniqueName: 
\"kubernetes.io/projected/e4cc3a52-da49-43de-afbf-6de89927d28d-kube-api-access-zhnhl\") pod \"certified-operators-l2kj7\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.551726 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:31 crc kubenswrapper[4786]: I0313 13:00:31.824280 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l2kj7"] Mar 13 13:00:32 crc kubenswrapper[4786]: I0313 13:00:32.614313 4786 generic.go:334] "Generic (PLEG): container finished" podID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerID="b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59" exitCode=0 Mar 13 13:00:32 crc kubenswrapper[4786]: I0313 13:00:32.614398 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2kj7" event={"ID":"e4cc3a52-da49-43de-afbf-6de89927d28d","Type":"ContainerDied","Data":"b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59"} Mar 13 13:00:32 crc kubenswrapper[4786]: I0313 13:00:32.614655 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2kj7" event={"ID":"e4cc3a52-da49-43de-afbf-6de89927d28d","Type":"ContainerStarted","Data":"d214ae09b97418df331c16c260592dc8099c48c30ac8b45a4f629a6de628a37d"} Mar 13 13:00:34 crc kubenswrapper[4786]: I0313 13:00:34.634037 4786 generic.go:334] "Generic (PLEG): container finished" podID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerID="42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f" exitCode=0 Mar 13 13:00:34 crc kubenswrapper[4786]: I0313 13:00:34.634086 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2kj7" 
event={"ID":"e4cc3a52-da49-43de-afbf-6de89927d28d","Type":"ContainerDied","Data":"42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f"} Mar 13 13:00:35 crc kubenswrapper[4786]: I0313 13:00:35.650396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2kj7" event={"ID":"e4cc3a52-da49-43de-afbf-6de89927d28d","Type":"ContainerStarted","Data":"6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4"} Mar 13 13:00:35 crc kubenswrapper[4786]: I0313 13:00:35.674220 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l2kj7" podStartSLOduration=2.247070311 podStartE2EDuration="4.674201154s" podCreationTimestamp="2026-03-13 13:00:31 +0000 UTC" firstStartedPulling="2026-03-13 13:00:32.616636462 +0000 UTC m=+4419.896289909" lastFinishedPulling="2026-03-13 13:00:35.043767295 +0000 UTC m=+4422.323420752" observedRunningTime="2026-03-13 13:00:35.672511068 +0000 UTC m=+4422.952164565" watchObservedRunningTime="2026-03-13 13:00:35.674201154 +0000 UTC m=+4422.953854621" Mar 13 13:00:38 crc kubenswrapper[4786]: I0313 13:00:38.169026 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:00:38 crc kubenswrapper[4786]: I0313 13:00:38.169110 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:00:41 crc kubenswrapper[4786]: I0313 13:00:41.552283 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:41 crc kubenswrapper[4786]: I0313 13:00:41.552835 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:41 crc kubenswrapper[4786]: I0313 13:00:41.615335 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:41 crc kubenswrapper[4786]: I0313 13:00:41.735453 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:41 crc kubenswrapper[4786]: I0313 13:00:41.848703 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l2kj7"] Mar 13 13:00:43 crc kubenswrapper[4786]: I0313 13:00:43.708916 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l2kj7" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerName="registry-server" containerID="cri-o://6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4" gracePeriod=2 Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.110823 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.252658 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-catalog-content\") pod \"e4cc3a52-da49-43de-afbf-6de89927d28d\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.252866 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhnhl\" (UniqueName: \"kubernetes.io/projected/e4cc3a52-da49-43de-afbf-6de89927d28d-kube-api-access-zhnhl\") pod \"e4cc3a52-da49-43de-afbf-6de89927d28d\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.253032 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-utilities\") pod \"e4cc3a52-da49-43de-afbf-6de89927d28d\" (UID: \"e4cc3a52-da49-43de-afbf-6de89927d28d\") " Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.253728 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-utilities" (OuterVolumeSpecName: "utilities") pod "e4cc3a52-da49-43de-afbf-6de89927d28d" (UID: "e4cc3a52-da49-43de-afbf-6de89927d28d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.260166 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4cc3a52-da49-43de-afbf-6de89927d28d-kube-api-access-zhnhl" (OuterVolumeSpecName: "kube-api-access-zhnhl") pod "e4cc3a52-da49-43de-afbf-6de89927d28d" (UID: "e4cc3a52-da49-43de-afbf-6de89927d28d"). InnerVolumeSpecName "kube-api-access-zhnhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.317054 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4cc3a52-da49-43de-afbf-6de89927d28d" (UID: "e4cc3a52-da49-43de-afbf-6de89927d28d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.354473 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.354540 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cc3a52-da49-43de-afbf-6de89927d28d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.354555 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhnhl\" (UniqueName: \"kubernetes.io/projected/e4cc3a52-da49-43de-afbf-6de89927d28d-kube-api-access-zhnhl\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.721090 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l2kj7" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.721112 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2kj7" event={"ID":"e4cc3a52-da49-43de-afbf-6de89927d28d","Type":"ContainerDied","Data":"6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4"} Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.721281 4786 scope.go:117] "RemoveContainer" containerID="6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.721021 4786 generic.go:334] "Generic (PLEG): container finished" podID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerID="6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4" exitCode=0 Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.721447 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2kj7" event={"ID":"e4cc3a52-da49-43de-afbf-6de89927d28d","Type":"ContainerDied","Data":"d214ae09b97418df331c16c260592dc8099c48c30ac8b45a4f629a6de628a37d"} Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.756107 4786 scope.go:117] "RemoveContainer" containerID="42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.762855 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l2kj7"] Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.769145 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l2kj7"] Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.782097 4786 scope.go:117] "RemoveContainer" containerID="b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.807155 4786 scope.go:117] "RemoveContainer" 
containerID="6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4" Mar 13 13:00:44 crc kubenswrapper[4786]: E0313 13:00:44.807575 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4\": container with ID starting with 6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4 not found: ID does not exist" containerID="6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.807618 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4"} err="failed to get container status \"6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4\": rpc error: code = NotFound desc = could not find container \"6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4\": container with ID starting with 6e5b9fc6eaba29f050fc378dea536251b77c5dbbadcc5e8c6fa86c02bcbe7dc4 not found: ID does not exist" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.807644 4786 scope.go:117] "RemoveContainer" containerID="42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f" Mar 13 13:00:44 crc kubenswrapper[4786]: E0313 13:00:44.808020 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f\": container with ID starting with 42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f not found: ID does not exist" containerID="42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.808059 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f"} err="failed to get container status \"42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f\": rpc error: code = NotFound desc = could not find container \"42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f\": container with ID starting with 42555cb57136d2c9983e32cff41bee85170c647ff41b477cc5419578e9ac895f not found: ID does not exist" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.808089 4786 scope.go:117] "RemoveContainer" containerID="b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59" Mar 13 13:00:44 crc kubenswrapper[4786]: E0313 13:00:44.808351 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59\": container with ID starting with b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59 not found: ID does not exist" containerID="b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59" Mar 13 13:00:44 crc kubenswrapper[4786]: I0313 13:00:44.808376 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59"} err="failed to get container status \"b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59\": rpc error: code = NotFound desc = could not find container \"b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59\": container with ID starting with b926978e0908138f4582ace062f2b6272794dd4f19a1a2668ee260b72af0bb59 not found: ID does not exist" Mar 13 13:00:45 crc kubenswrapper[4786]: I0313 13:00:45.463304 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" path="/var/lib/kubelet/pods/e4cc3a52-da49-43de-afbf-6de89927d28d/volumes" Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 
13:01:08.169336 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 13:01:08.169933 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 13:01:08.169995 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 13:01:08.170846 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 13:01:08.170963 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" gracePeriod=600 Mar 13 13:01:08 crc kubenswrapper[4786]: E0313 13:01:08.289360 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 13:01:08.919640 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" exitCode=0 Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 13:01:08.919694 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab"} Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 13:01:08.919734 4786 scope.go:117] "RemoveContainer" containerID="06f2398628635d205a73435e995f3165f17e4749a03d0c6f5e7bcd488c6712f1" Mar 13 13:01:08 crc kubenswrapper[4786]: I0313 13:01:08.920287 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:01:08 crc kubenswrapper[4786]: E0313 13:01:08.920621 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:01:24 crc kubenswrapper[4786]: I0313 13:01:24.440749 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:01:24 crc kubenswrapper[4786]: E0313 13:01:24.441363 4786 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:01:35 crc kubenswrapper[4786]: I0313 13:01:35.441725 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:01:35 crc kubenswrapper[4786]: E0313 13:01:35.442650 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:01:46 crc kubenswrapper[4786]: I0313 13:01:46.440329 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:01:46 crc kubenswrapper[4786]: E0313 13:01:46.441123 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:01:59 crc kubenswrapper[4786]: I0313 13:01:59.440662 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:01:59 crc kubenswrapper[4786]: E0313 13:01:59.441376 4786 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.141758 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556782-hdfbf"] Mar 13 13:02:00 crc kubenswrapper[4786]: E0313 13:02:00.142137 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerName="extract-utilities" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.142150 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerName="extract-utilities" Mar 13 13:02:00 crc kubenswrapper[4786]: E0313 13:02:00.142169 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerName="registry-server" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.142174 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerName="registry-server" Mar 13 13:02:00 crc kubenswrapper[4786]: E0313 13:02:00.142199 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerName="extract-content" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.142208 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerName="extract-content" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.142367 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cc3a52-da49-43de-afbf-6de89927d28d" containerName="registry-server" Mar 13 13:02:00 crc 
kubenswrapper[4786]: I0313 13:02:00.142862 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556782-hdfbf" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.149815 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556782-hdfbf"] Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.150460 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.150860 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.152262 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.291115 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsw7m\" (UniqueName: \"kubernetes.io/projected/10371942-9d00-4050-9660-0d53bd08e6b7-kube-api-access-lsw7m\") pod \"auto-csr-approver-29556782-hdfbf\" (UID: \"10371942-9d00-4050-9660-0d53bd08e6b7\") " pod="openshift-infra/auto-csr-approver-29556782-hdfbf" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.394868 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsw7m\" (UniqueName: \"kubernetes.io/projected/10371942-9d00-4050-9660-0d53bd08e6b7-kube-api-access-lsw7m\") pod \"auto-csr-approver-29556782-hdfbf\" (UID: \"10371942-9d00-4050-9660-0d53bd08e6b7\") " pod="openshift-infra/auto-csr-approver-29556782-hdfbf" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.421650 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsw7m\" (UniqueName: \"kubernetes.io/projected/10371942-9d00-4050-9660-0d53bd08e6b7-kube-api-access-lsw7m\") pod 
\"auto-csr-approver-29556782-hdfbf\" (UID: \"10371942-9d00-4050-9660-0d53bd08e6b7\") " pod="openshift-infra/auto-csr-approver-29556782-hdfbf" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.459593 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556782-hdfbf" Mar 13 13:02:00 crc kubenswrapper[4786]: I0313 13:02:00.935304 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556782-hdfbf"] Mar 13 13:02:01 crc kubenswrapper[4786]: I0313 13:02:01.317590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556782-hdfbf" event={"ID":"10371942-9d00-4050-9660-0d53bd08e6b7","Type":"ContainerStarted","Data":"a4a694789a41361441442ff010e7d4caba12e3b4a685ddc652b6323a3ab9af25"} Mar 13 13:02:02 crc kubenswrapper[4786]: I0313 13:02:02.326410 4786 generic.go:334] "Generic (PLEG): container finished" podID="10371942-9d00-4050-9660-0d53bd08e6b7" containerID="182377278b7c09698facfc29bec1b26c273c42fb9f13ed0903a3fe5a6d8e4d11" exitCode=0 Mar 13 13:02:02 crc kubenswrapper[4786]: I0313 13:02:02.326469 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556782-hdfbf" event={"ID":"10371942-9d00-4050-9660-0d53bd08e6b7","Type":"ContainerDied","Data":"182377278b7c09698facfc29bec1b26c273c42fb9f13ed0903a3fe5a6d8e4d11"} Mar 13 13:02:03 crc kubenswrapper[4786]: I0313 13:02:03.647864 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556782-hdfbf" Mar 13 13:02:03 crc kubenswrapper[4786]: I0313 13:02:03.741605 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsw7m\" (UniqueName: \"kubernetes.io/projected/10371942-9d00-4050-9660-0d53bd08e6b7-kube-api-access-lsw7m\") pod \"10371942-9d00-4050-9660-0d53bd08e6b7\" (UID: \"10371942-9d00-4050-9660-0d53bd08e6b7\") " Mar 13 13:02:03 crc kubenswrapper[4786]: I0313 13:02:03.746067 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10371942-9d00-4050-9660-0d53bd08e6b7-kube-api-access-lsw7m" (OuterVolumeSpecName: "kube-api-access-lsw7m") pod "10371942-9d00-4050-9660-0d53bd08e6b7" (UID: "10371942-9d00-4050-9660-0d53bd08e6b7"). InnerVolumeSpecName "kube-api-access-lsw7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:02:03 crc kubenswrapper[4786]: I0313 13:02:03.844030 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsw7m\" (UniqueName: \"kubernetes.io/projected/10371942-9d00-4050-9660-0d53bd08e6b7-kube-api-access-lsw7m\") on node \"crc\" DevicePath \"\"" Mar 13 13:02:04 crc kubenswrapper[4786]: I0313 13:02:04.341870 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556782-hdfbf" event={"ID":"10371942-9d00-4050-9660-0d53bd08e6b7","Type":"ContainerDied","Data":"a4a694789a41361441442ff010e7d4caba12e3b4a685ddc652b6323a3ab9af25"} Mar 13 13:02:04 crc kubenswrapper[4786]: I0313 13:02:04.341940 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4a694789a41361441442ff010e7d4caba12e3b4a685ddc652b6323a3ab9af25" Mar 13 13:02:04 crc kubenswrapper[4786]: I0313 13:02:04.341975 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556782-hdfbf" Mar 13 13:02:04 crc kubenswrapper[4786]: I0313 13:02:04.713369 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556776-44gm7"] Mar 13 13:02:04 crc kubenswrapper[4786]: I0313 13:02:04.718379 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556776-44gm7"] Mar 13 13:02:05 crc kubenswrapper[4786]: I0313 13:02:05.460068 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7507e0-d80c-4851-b724-ec37229d2d45" path="/var/lib/kubelet/pods/ed7507e0-d80c-4851-b724-ec37229d2d45/volumes" Mar 13 13:02:12 crc kubenswrapper[4786]: I0313 13:02:12.440152 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:02:12 crc kubenswrapper[4786]: E0313 13:02:12.440902 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:02:18 crc kubenswrapper[4786]: I0313 13:02:18.865529 4786 scope.go:117] "RemoveContainer" containerID="47f26da5cda5456180c7c0f0c24391377edf99338244c50359859cf1be775419" Mar 13 13:02:26 crc kubenswrapper[4786]: I0313 13:02:26.440815 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:02:26 crc kubenswrapper[4786]: E0313 13:02:26.441998 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:02:39 crc kubenswrapper[4786]: I0313 13:02:39.441513 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:02:39 crc kubenswrapper[4786]: E0313 13:02:39.442359 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:02:51 crc kubenswrapper[4786]: I0313 13:02:51.440574 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:02:51 crc kubenswrapper[4786]: E0313 13:02:51.441527 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:03:02 crc kubenswrapper[4786]: I0313 13:03:02.441312 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:03:02 crc kubenswrapper[4786]: E0313 13:03:02.442047 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:03:14 crc kubenswrapper[4786]: I0313 13:03:14.440994 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:03:14 crc kubenswrapper[4786]: E0313 13:03:14.441691 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:03:28 crc kubenswrapper[4786]: I0313 13:03:28.440601 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:03:28 crc kubenswrapper[4786]: E0313 13:03:28.441313 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:03:43 crc kubenswrapper[4786]: I0313 13:03:43.443725 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:03:43 crc kubenswrapper[4786]: E0313 13:03:43.444335 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:03:56 crc kubenswrapper[4786]: I0313 13:03:56.440989 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:03:56 crc kubenswrapper[4786]: E0313 13:03:56.441849 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.139676 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556784-77t49"] Mar 13 13:04:00 crc kubenswrapper[4786]: E0313 13:04:00.140353 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10371942-9d00-4050-9660-0d53bd08e6b7" containerName="oc" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.140368 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="10371942-9d00-4050-9660-0d53bd08e6b7" containerName="oc" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.140538 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="10371942-9d00-4050-9660-0d53bd08e6b7" containerName="oc" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.141134 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556784-77t49" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.144793 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.144828 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.145311 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.157520 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556784-77t49"] Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.191841 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnr8f\" (UniqueName: \"kubernetes.io/projected/0c261bdd-397b-4b84-83a9-36c4e1800eb3-kube-api-access-jnr8f\") pod \"auto-csr-approver-29556784-77t49\" (UID: \"0c261bdd-397b-4b84-83a9-36c4e1800eb3\") " pod="openshift-infra/auto-csr-approver-29556784-77t49" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.293746 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnr8f\" (UniqueName: \"kubernetes.io/projected/0c261bdd-397b-4b84-83a9-36c4e1800eb3-kube-api-access-jnr8f\") pod \"auto-csr-approver-29556784-77t49\" (UID: \"0c261bdd-397b-4b84-83a9-36c4e1800eb3\") " pod="openshift-infra/auto-csr-approver-29556784-77t49" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.312183 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnr8f\" (UniqueName: \"kubernetes.io/projected/0c261bdd-397b-4b84-83a9-36c4e1800eb3-kube-api-access-jnr8f\") pod \"auto-csr-approver-29556784-77t49\" (UID: \"0c261bdd-397b-4b84-83a9-36c4e1800eb3\") " 
pod="openshift-infra/auto-csr-approver-29556784-77t49" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.462012 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556784-77t49" Mar 13 13:04:00 crc kubenswrapper[4786]: I0313 13:04:00.881162 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556784-77t49"] Mar 13 13:04:01 crc kubenswrapper[4786]: I0313 13:04:01.150517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556784-77t49" event={"ID":"0c261bdd-397b-4b84-83a9-36c4e1800eb3","Type":"ContainerStarted","Data":"d035f015e0a9911f7eb4813839b0a7ecf3d2d6b20356d571272e869d8a3e42db"} Mar 13 13:04:03 crc kubenswrapper[4786]: I0313 13:04:03.168648 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c261bdd-397b-4b84-83a9-36c4e1800eb3" containerID="5a9bf9a4b6ba3a94305b9a1903db42206c11eec14be819923256e0849a6a50db" exitCode=0 Mar 13 13:04:03 crc kubenswrapper[4786]: I0313 13:04:03.168988 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556784-77t49" event={"ID":"0c261bdd-397b-4b84-83a9-36c4e1800eb3","Type":"ContainerDied","Data":"5a9bf9a4b6ba3a94305b9a1903db42206c11eec14be819923256e0849a6a50db"} Mar 13 13:04:04 crc kubenswrapper[4786]: I0313 13:04:04.606733 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556784-77t49" Mar 13 13:04:04 crc kubenswrapper[4786]: I0313 13:04:04.656536 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnr8f\" (UniqueName: \"kubernetes.io/projected/0c261bdd-397b-4b84-83a9-36c4e1800eb3-kube-api-access-jnr8f\") pod \"0c261bdd-397b-4b84-83a9-36c4e1800eb3\" (UID: \"0c261bdd-397b-4b84-83a9-36c4e1800eb3\") " Mar 13 13:04:04 crc kubenswrapper[4786]: I0313 13:04:04.664920 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c261bdd-397b-4b84-83a9-36c4e1800eb3-kube-api-access-jnr8f" (OuterVolumeSpecName: "kube-api-access-jnr8f") pod "0c261bdd-397b-4b84-83a9-36c4e1800eb3" (UID: "0c261bdd-397b-4b84-83a9-36c4e1800eb3"). InnerVolumeSpecName "kube-api-access-jnr8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:04:04 crc kubenswrapper[4786]: I0313 13:04:04.758000 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnr8f\" (UniqueName: \"kubernetes.io/projected/0c261bdd-397b-4b84-83a9-36c4e1800eb3-kube-api-access-jnr8f\") on node \"crc\" DevicePath \"\"" Mar 13 13:04:05 crc kubenswrapper[4786]: I0313 13:04:05.183136 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556784-77t49" event={"ID":"0c261bdd-397b-4b84-83a9-36c4e1800eb3","Type":"ContainerDied","Data":"d035f015e0a9911f7eb4813839b0a7ecf3d2d6b20356d571272e869d8a3e42db"} Mar 13 13:04:05 crc kubenswrapper[4786]: I0313 13:04:05.183188 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d035f015e0a9911f7eb4813839b0a7ecf3d2d6b20356d571272e869d8a3e42db" Mar 13 13:04:05 crc kubenswrapper[4786]: I0313 13:04:05.183253 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556784-77t49" Mar 13 13:04:05 crc kubenswrapper[4786]: I0313 13:04:05.685990 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pfflz"] Mar 13 13:04:05 crc kubenswrapper[4786]: I0313 13:04:05.694707 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pfflz"] Mar 13 13:04:07 crc kubenswrapper[4786]: I0313 13:04:07.456576 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f06c307-3017-4181-b34d-60194499d5cf" path="/var/lib/kubelet/pods/6f06c307-3017-4181-b34d-60194499d5cf/volumes" Mar 13 13:04:10 crc kubenswrapper[4786]: I0313 13:04:10.440371 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:04:10 crc kubenswrapper[4786]: E0313 13:04:10.441088 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:04:18 crc kubenswrapper[4786]: I0313 13:04:18.944841 4786 scope.go:117] "RemoveContainer" containerID="d658e948756e7764b17d6dc4547ec84f4cfda2c110a686c2776ad9a2c59b0f75" Mar 13 13:04:22 crc kubenswrapper[4786]: I0313 13:04:22.441403 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:04:22 crc kubenswrapper[4786]: E0313 13:04:22.442162 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:04:37 crc kubenswrapper[4786]: I0313 13:04:37.441069 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:04:37 crc kubenswrapper[4786]: E0313 13:04:37.441969 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:04:50 crc kubenswrapper[4786]: I0313 13:04:50.440471 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:04:50 crc kubenswrapper[4786]: E0313 13:04:50.443294 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:05:02 crc kubenswrapper[4786]: I0313 13:05:02.440439 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:05:02 crc kubenswrapper[4786]: E0313 13:05:02.441262 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:05:14 crc kubenswrapper[4786]: I0313 13:05:14.440178 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:05:14 crc kubenswrapper[4786]: E0313 13:05:14.440869 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:05:26 crc kubenswrapper[4786]: I0313 13:05:26.440651 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:05:26 crc kubenswrapper[4786]: E0313 13:05:26.441437 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:05:37 crc kubenswrapper[4786]: I0313 13:05:37.855790 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pd5br/must-gather-zk95r"] Mar 13 13:05:37 crc kubenswrapper[4786]: E0313 13:05:37.856748 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c261bdd-397b-4b84-83a9-36c4e1800eb3" containerName="oc" Mar 13 13:05:37 crc kubenswrapper[4786]: 
I0313 13:05:37.856763 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c261bdd-397b-4b84-83a9-36c4e1800eb3" containerName="oc" Mar 13 13:05:37 crc kubenswrapper[4786]: I0313 13:05:37.856936 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c261bdd-397b-4b84-83a9-36c4e1800eb3" containerName="oc" Mar 13 13:05:37 crc kubenswrapper[4786]: I0313 13:05:37.857854 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:05:37 crc kubenswrapper[4786]: I0313 13:05:37.860745 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pd5br"/"default-dockercfg-w7855" Mar 13 13:05:37 crc kubenswrapper[4786]: I0313 13:05:37.860973 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pd5br"/"kube-root-ca.crt" Mar 13 13:05:37 crc kubenswrapper[4786]: I0313 13:05:37.861097 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pd5br"/"openshift-service-ca.crt" Mar 13 13:05:37 crc kubenswrapper[4786]: I0313 13:05:37.867460 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pd5br/must-gather-zk95r"] Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.004453 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97hgs\" (UniqueName: \"kubernetes.io/projected/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-kube-api-access-97hgs\") pod \"must-gather-zk95r\" (UID: \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\") " pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.004492 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-must-gather-output\") pod \"must-gather-zk95r\" (UID: 
\"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\") " pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.105777 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97hgs\" (UniqueName: \"kubernetes.io/projected/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-kube-api-access-97hgs\") pod \"must-gather-zk95r\" (UID: \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\") " pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.105827 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-must-gather-output\") pod \"must-gather-zk95r\" (UID: \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\") " pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.106298 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-must-gather-output\") pod \"must-gather-zk95r\" (UID: \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\") " pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.133248 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97hgs\" (UniqueName: \"kubernetes.io/projected/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-kube-api-access-97hgs\") pod \"must-gather-zk95r\" (UID: \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\") " pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.177742 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.440954 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:05:38 crc kubenswrapper[4786]: E0313 13:05:38.442045 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.618615 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pd5br/must-gather-zk95r"] Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.634080 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 13:05:38 crc kubenswrapper[4786]: I0313 13:05:38.889008 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pd5br/must-gather-zk95r" event={"ID":"c9c178b0-a334-4f29-a5c5-c7d86e67f16b","Type":"ContainerStarted","Data":"196f882eec007a1953988c1404efc028fb0547ea76ed1635b7c95325a1a774c2"} Mar 13 13:05:44 crc kubenswrapper[4786]: I0313 13:05:44.938759 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pd5br/must-gather-zk95r" event={"ID":"c9c178b0-a334-4f29-a5c5-c7d86e67f16b","Type":"ContainerStarted","Data":"6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277"} Mar 13 13:05:45 crc kubenswrapper[4786]: I0313 13:05:45.945517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pd5br/must-gather-zk95r" 
event={"ID":"c9c178b0-a334-4f29-a5c5-c7d86e67f16b","Type":"ContainerStarted","Data":"a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d"} Mar 13 13:05:45 crc kubenswrapper[4786]: I0313 13:05:45.962460 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pd5br/must-gather-zk95r" podStartSLOduration=3.034710903 podStartE2EDuration="8.962439682s" podCreationTimestamp="2026-03-13 13:05:37 +0000 UTC" firstStartedPulling="2026-03-13 13:05:38.633788737 +0000 UTC m=+4725.913442184" lastFinishedPulling="2026-03-13 13:05:44.561517516 +0000 UTC m=+4731.841170963" observedRunningTime="2026-03-13 13:05:45.958652249 +0000 UTC m=+4733.238305696" watchObservedRunningTime="2026-03-13 13:05:45.962439682 +0000 UTC m=+4733.242093129" Mar 13 13:05:51 crc kubenswrapper[4786]: I0313 13:05:51.440441 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:05:51 crc kubenswrapper[4786]: E0313 13:05:51.441113 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.139800 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556786-zlxg2"] Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.141557 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556786-zlxg2" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.144591 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.144797 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.149607 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.150580 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8qs\" (UniqueName: \"kubernetes.io/projected/1713d69b-5497-4c42-a01c-bd6dbad8dd32-kube-api-access-pm8qs\") pod \"auto-csr-approver-29556786-zlxg2\" (UID: \"1713d69b-5497-4c42-a01c-bd6dbad8dd32\") " pod="openshift-infra/auto-csr-approver-29556786-zlxg2" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.157638 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556786-zlxg2"] Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.251910 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8qs\" (UniqueName: \"kubernetes.io/projected/1713d69b-5497-4c42-a01c-bd6dbad8dd32-kube-api-access-pm8qs\") pod \"auto-csr-approver-29556786-zlxg2\" (UID: \"1713d69b-5497-4c42-a01c-bd6dbad8dd32\") " pod="openshift-infra/auto-csr-approver-29556786-zlxg2" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.270248 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8qs\" (UniqueName: \"kubernetes.io/projected/1713d69b-5497-4c42-a01c-bd6dbad8dd32-kube-api-access-pm8qs\") pod \"auto-csr-approver-29556786-zlxg2\" (UID: \"1713d69b-5497-4c42-a01c-bd6dbad8dd32\") " 
pod="openshift-infra/auto-csr-approver-29556786-zlxg2" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.463504 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556786-zlxg2" Mar 13 13:06:00 crc kubenswrapper[4786]: I0313 13:06:00.896180 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556786-zlxg2"] Mar 13 13:06:00 crc kubenswrapper[4786]: W0313 13:06:00.902425 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1713d69b_5497_4c42_a01c_bd6dbad8dd32.slice/crio-45eeae48e748d8cf25676a84fc6ef34ae25632a029e29d2c783015d81b881fa3 WatchSource:0}: Error finding container 45eeae48e748d8cf25676a84fc6ef34ae25632a029e29d2c783015d81b881fa3: Status 404 returned error can't find the container with id 45eeae48e748d8cf25676a84fc6ef34ae25632a029e29d2c783015d81b881fa3 Mar 13 13:06:01 crc kubenswrapper[4786]: I0313 13:06:01.037153 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556786-zlxg2" event={"ID":"1713d69b-5497-4c42-a01c-bd6dbad8dd32","Type":"ContainerStarted","Data":"45eeae48e748d8cf25676a84fc6ef34ae25632a029e29d2c783015d81b881fa3"} Mar 13 13:06:03 crc kubenswrapper[4786]: I0313 13:06:03.052286 4786 generic.go:334] "Generic (PLEG): container finished" podID="1713d69b-5497-4c42-a01c-bd6dbad8dd32" containerID="de6cb97b29ab8bd03a27ff16f01d478496400661617023ab64799ace81c7244d" exitCode=0 Mar 13 13:06:03 crc kubenswrapper[4786]: I0313 13:06:03.052386 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556786-zlxg2" event={"ID":"1713d69b-5497-4c42-a01c-bd6dbad8dd32","Type":"ContainerDied","Data":"de6cb97b29ab8bd03a27ff16f01d478496400661617023ab64799ace81c7244d"} Mar 13 13:06:04 crc kubenswrapper[4786]: I0313 13:06:04.304911 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556786-zlxg2" Mar 13 13:06:04 crc kubenswrapper[4786]: I0313 13:06:04.408462 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm8qs\" (UniqueName: \"kubernetes.io/projected/1713d69b-5497-4c42-a01c-bd6dbad8dd32-kube-api-access-pm8qs\") pod \"1713d69b-5497-4c42-a01c-bd6dbad8dd32\" (UID: \"1713d69b-5497-4c42-a01c-bd6dbad8dd32\") " Mar 13 13:06:04 crc kubenswrapper[4786]: I0313 13:06:04.413758 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1713d69b-5497-4c42-a01c-bd6dbad8dd32-kube-api-access-pm8qs" (OuterVolumeSpecName: "kube-api-access-pm8qs") pod "1713d69b-5497-4c42-a01c-bd6dbad8dd32" (UID: "1713d69b-5497-4c42-a01c-bd6dbad8dd32"). InnerVolumeSpecName "kube-api-access-pm8qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:06:04 crc kubenswrapper[4786]: I0313 13:06:04.440712 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:06:04 crc kubenswrapper[4786]: E0313 13:06:04.441125 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8ncs8_openshift-machine-config-operator(75da9242-3ddf-4eca-82df-a5fc998b0fdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" Mar 13 13:06:04 crc kubenswrapper[4786]: I0313 13:06:04.509968 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm8qs\" (UniqueName: \"kubernetes.io/projected/1713d69b-5497-4c42-a01c-bd6dbad8dd32-kube-api-access-pm8qs\") on node \"crc\" DevicePath \"\"" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.069332 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29556786-zlxg2" event={"ID":"1713d69b-5497-4c42-a01c-bd6dbad8dd32","Type":"ContainerDied","Data":"45eeae48e748d8cf25676a84fc6ef34ae25632a029e29d2c783015d81b881fa3"} Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.069372 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45eeae48e748d8cf25676a84fc6ef34ae25632a029e29d2c783015d81b881fa3" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.069378 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556786-zlxg2" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.380368 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556780-mvm68"] Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.392446 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556780-mvm68"] Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.449757 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fccf9fb-7515-4672-8286-33b95ee89998" path="/var/lib/kubelet/pods/4fccf9fb-7515-4672-8286-33b95ee89998/volumes" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.587948 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bvh5m"] Mar 13 13:06:05 crc kubenswrapper[4786]: E0313 13:06:05.588283 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1713d69b-5497-4c42-a01c-bd6dbad8dd32" containerName="oc" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.588304 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1713d69b-5497-4c42-a01c-bd6dbad8dd32" containerName="oc" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.588443 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1713d69b-5497-4c42-a01c-bd6dbad8dd32" containerName="oc" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 
13:06:05.589579 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.598831 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bvh5m"] Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.623084 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-catalog-content\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.623229 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-utilities\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.623255 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwpqm\" (UniqueName: \"kubernetes.io/projected/ea894041-703f-4722-b987-4c8060822d36-kube-api-access-dwpqm\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.724223 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwpqm\" (UniqueName: \"kubernetes.io/projected/ea894041-703f-4722-b987-4c8060822d36-kube-api-access-dwpqm\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc 
kubenswrapper[4786]: I0313 13:06:05.724843 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-catalog-content\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.724953 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-utilities\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.725341 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-catalog-content\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.725486 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-utilities\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.748942 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwpqm\" (UniqueName: \"kubernetes.io/projected/ea894041-703f-4722-b987-4c8060822d36-kube-api-access-dwpqm\") pod \"community-operators-bvh5m\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:05 crc kubenswrapper[4786]: I0313 13:06:05.916093 4786 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:06 crc kubenswrapper[4786]: I0313 13:06:06.395345 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bvh5m"] Mar 13 13:06:07 crc kubenswrapper[4786]: I0313 13:06:07.090576 4786 generic.go:334] "Generic (PLEG): container finished" podID="ea894041-703f-4722-b987-4c8060822d36" containerID="8fcd4c4b0d813361f8a2ae3417ff2bbf9f20c9b93f9c88552e90a76a4d009361" exitCode=0 Mar 13 13:06:07 crc kubenswrapper[4786]: I0313 13:06:07.090802 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvh5m" event={"ID":"ea894041-703f-4722-b987-4c8060822d36","Type":"ContainerDied","Data":"8fcd4c4b0d813361f8a2ae3417ff2bbf9f20c9b93f9c88552e90a76a4d009361"} Mar 13 13:06:07 crc kubenswrapper[4786]: I0313 13:06:07.091061 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvh5m" event={"ID":"ea894041-703f-4722-b987-4c8060822d36","Type":"ContainerStarted","Data":"36d7a51bb821677909e9422f3f694bcadffa6d9f6c0463e0a3d33d6431603057"} Mar 13 13:06:08 crc kubenswrapper[4786]: I0313 13:06:08.099798 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvh5m" event={"ID":"ea894041-703f-4722-b987-4c8060822d36","Type":"ContainerStarted","Data":"32e0d5751453ae8d19a4609a3e5eda4fb80c7bd0c4d31b6cb91f61b4a83640ed"} Mar 13 13:06:09 crc kubenswrapper[4786]: I0313 13:06:09.108018 4786 generic.go:334] "Generic (PLEG): container finished" podID="ea894041-703f-4722-b987-4c8060822d36" containerID="32e0d5751453ae8d19a4609a3e5eda4fb80c7bd0c4d31b6cb91f61b4a83640ed" exitCode=0 Mar 13 13:06:09 crc kubenswrapper[4786]: I0313 13:06:09.108076 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvh5m" 
event={"ID":"ea894041-703f-4722-b987-4c8060822d36","Type":"ContainerDied","Data":"32e0d5751453ae8d19a4609a3e5eda4fb80c7bd0c4d31b6cb91f61b4a83640ed"} Mar 13 13:06:10 crc kubenswrapper[4786]: I0313 13:06:10.116291 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvh5m" event={"ID":"ea894041-703f-4722-b987-4c8060822d36","Type":"ContainerStarted","Data":"9a907985dfcc80f06f4079ea8b6958b95bede156915d4b18f08c909cfc52564b"} Mar 13 13:06:10 crc kubenswrapper[4786]: I0313 13:06:10.136976 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bvh5m" podStartSLOduration=2.7080482359999998 podStartE2EDuration="5.136957103s" podCreationTimestamp="2026-03-13 13:06:05 +0000 UTC" firstStartedPulling="2026-03-13 13:06:07.093064631 +0000 UTC m=+4754.372718078" lastFinishedPulling="2026-03-13 13:06:09.521973488 +0000 UTC m=+4756.801626945" observedRunningTime="2026-03-13 13:06:10.132272216 +0000 UTC m=+4757.411925663" watchObservedRunningTime="2026-03-13 13:06:10.136957103 +0000 UTC m=+4757.416610550" Mar 13 13:06:15 crc kubenswrapper[4786]: I0313 13:06:15.916741 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:15 crc kubenswrapper[4786]: I0313 13:06:15.917312 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:15 crc kubenswrapper[4786]: I0313 13:06:15.959475 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:16 crc kubenswrapper[4786]: I0313 13:06:16.193102 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:16 crc kubenswrapper[4786]: I0313 13:06:16.241908 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-bvh5m"] Mar 13 13:06:16 crc kubenswrapper[4786]: I0313 13:06:16.440829 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:06:17 crc kubenswrapper[4786]: I0313 13:06:17.164781 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"ed75f631c4a09fe1dd62b258181c03ad5318f839913c480f2cabad9f2ce5eca0"} Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.171000 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bvh5m" podUID="ea894041-703f-4722-b987-4c8060822d36" containerName="registry-server" containerID="cri-o://9a907985dfcc80f06f4079ea8b6958b95bede156915d4b18f08c909cfc52564b" gracePeriod=2 Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.605393 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v52mq"] Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.607042 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.613365 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v52mq"] Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.717192 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkqt9\" (UniqueName: \"kubernetes.io/projected/421ee106-98a9-4361-846c-e7a957d90217-kube-api-access-kkqt9\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.717265 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-utilities\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.717510 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-catalog-content\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.818395 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-catalog-content\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.818592 4786 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-kkqt9\" (UniqueName: \"kubernetes.io/projected/421ee106-98a9-4361-846c-e7a957d90217-kube-api-access-kkqt9\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.818696 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-utilities\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.818974 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-catalog-content\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.819304 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-utilities\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.856505 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkqt9\" (UniqueName: \"kubernetes.io/projected/421ee106-98a9-4361-846c-e7a957d90217-kube-api-access-kkqt9\") pod \"redhat-operators-v52mq\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:18 crc kubenswrapper[4786]: I0313 13:06:18.927537 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.022116 4786 scope.go:117] "RemoveContainer" containerID="9f42ed417edab316bc9fcd7dafe5dd394e8eb7949e359c1b9af92022c8284a88" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.179202 4786 generic.go:334] "Generic (PLEG): container finished" podID="ea894041-703f-4722-b987-4c8060822d36" containerID="9a907985dfcc80f06f4079ea8b6958b95bede156915d4b18f08c909cfc52564b" exitCode=0 Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.179261 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvh5m" event={"ID":"ea894041-703f-4722-b987-4c8060822d36","Type":"ContainerDied","Data":"9a907985dfcc80f06f4079ea8b6958b95bede156915d4b18f08c909cfc52564b"} Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.312018 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.426257 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-utilities\") pod \"ea894041-703f-4722-b987-4c8060822d36\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.426591 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-catalog-content\") pod \"ea894041-703f-4722-b987-4c8060822d36\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.426703 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwpqm\" (UniqueName: 
\"kubernetes.io/projected/ea894041-703f-4722-b987-4c8060822d36-kube-api-access-dwpqm\") pod \"ea894041-703f-4722-b987-4c8060822d36\" (UID: \"ea894041-703f-4722-b987-4c8060822d36\") " Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.427848 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-utilities" (OuterVolumeSpecName: "utilities") pod "ea894041-703f-4722-b987-4c8060822d36" (UID: "ea894041-703f-4722-b987-4c8060822d36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.451900 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea894041-703f-4722-b987-4c8060822d36-kube-api-access-dwpqm" (OuterVolumeSpecName: "kube-api-access-dwpqm") pod "ea894041-703f-4722-b987-4c8060822d36" (UID: "ea894041-703f-4722-b987-4c8060822d36"). InnerVolumeSpecName "kube-api-access-dwpqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.528602 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea894041-703f-4722-b987-4c8060822d36" (UID: "ea894041-703f-4722-b987-4c8060822d36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.529467 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwpqm\" (UniqueName: \"kubernetes.io/projected/ea894041-703f-4722-b987-4c8060822d36-kube-api-access-dwpqm\") on node \"crc\" DevicePath \"\"" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.529493 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.529502 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea894041-703f-4722-b987-4c8060822d36-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:06:19 crc kubenswrapper[4786]: I0313 13:06:19.807086 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v52mq"] Mar 13 13:06:19 crc kubenswrapper[4786]: W0313 13:06:19.818911 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421ee106_98a9_4361_846c_e7a957d90217.slice/crio-ec7667ac3a68269379072fb5a563890bcb0e0c0c35331ad6dda1ba33da7ba474 WatchSource:0}: Error finding container ec7667ac3a68269379072fb5a563890bcb0e0c0c35331ad6dda1ba33da7ba474: Status 404 returned error can't find the container with id ec7667ac3a68269379072fb5a563890bcb0e0c0c35331ad6dda1ba33da7ba474 Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.187054 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvh5m" event={"ID":"ea894041-703f-4722-b987-4c8060822d36","Type":"ContainerDied","Data":"36d7a51bb821677909e9422f3f694bcadffa6d9f6c0463e0a3d33d6431603057"} Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.187350 4786 scope.go:117] "RemoveContainer" 
containerID="9a907985dfcc80f06f4079ea8b6958b95bede156915d4b18f08c909cfc52564b" Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.187480 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvh5m" Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.195351 4786 generic.go:334] "Generic (PLEG): container finished" podID="421ee106-98a9-4361-846c-e7a957d90217" containerID="9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b" exitCode=0 Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.195393 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v52mq" event={"ID":"421ee106-98a9-4361-846c-e7a957d90217","Type":"ContainerDied","Data":"9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b"} Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.195418 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v52mq" event={"ID":"421ee106-98a9-4361-846c-e7a957d90217","Type":"ContainerStarted","Data":"ec7667ac3a68269379072fb5a563890bcb0e0c0c35331ad6dda1ba33da7ba474"} Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.217487 4786 scope.go:117] "RemoveContainer" containerID="32e0d5751453ae8d19a4609a3e5eda4fb80c7bd0c4d31b6cb91f61b4a83640ed" Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.237401 4786 scope.go:117] "RemoveContainer" containerID="8fcd4c4b0d813361f8a2ae3417ff2bbf9f20c9b93f9c88552e90a76a4d009361" Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.243504 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bvh5m"] Mar 13 13:06:20 crc kubenswrapper[4786]: I0313 13:06:20.251161 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bvh5m"] Mar 13 13:06:21 crc kubenswrapper[4786]: I0313 13:06:21.204704 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-v52mq" event={"ID":"421ee106-98a9-4361-846c-e7a957d90217","Type":"ContainerStarted","Data":"cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2"} Mar 13 13:06:21 crc kubenswrapper[4786]: I0313 13:06:21.449002 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea894041-703f-4722-b987-4c8060822d36" path="/var/lib/kubelet/pods/ea894041-703f-4722-b987-4c8060822d36/volumes" Mar 13 13:06:22 crc kubenswrapper[4786]: I0313 13:06:22.214352 4786 generic.go:334] "Generic (PLEG): container finished" podID="421ee106-98a9-4361-846c-e7a957d90217" containerID="cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2" exitCode=0 Mar 13 13:06:22 crc kubenswrapper[4786]: I0313 13:06:22.214442 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v52mq" event={"ID":"421ee106-98a9-4361-846c-e7a957d90217","Type":"ContainerDied","Data":"cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2"} Mar 13 13:06:23 crc kubenswrapper[4786]: I0313 13:06:23.229051 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v52mq" event={"ID":"421ee106-98a9-4361-846c-e7a957d90217","Type":"ContainerStarted","Data":"1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad"} Mar 13 13:06:23 crc kubenswrapper[4786]: I0313 13:06:23.247046 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v52mq" podStartSLOduration=2.832637628 podStartE2EDuration="5.24702572s" podCreationTimestamp="2026-03-13 13:06:18 +0000 UTC" firstStartedPulling="2026-03-13 13:06:20.198219064 +0000 UTC m=+4767.477872511" lastFinishedPulling="2026-03-13 13:06:22.612607156 +0000 UTC m=+4769.892260603" observedRunningTime="2026-03-13 13:06:23.244900933 +0000 UTC m=+4770.524554410" watchObservedRunningTime="2026-03-13 13:06:23.24702572 +0000 UTC m=+4770.526679177" Mar 13 13:06:28 crc 
kubenswrapper[4786]: I0313 13:06:28.928573 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:28 crc kubenswrapper[4786]: I0313 13:06:28.929091 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:29 crc kubenswrapper[4786]: I0313 13:06:29.968392 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v52mq" podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="registry-server" probeResult="failure" output=< Mar 13 13:06:29 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 13:06:29 crc kubenswrapper[4786]: > Mar 13 13:06:38 crc kubenswrapper[4786]: I0313 13:06:38.970544 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:39 crc kubenswrapper[4786]: I0313 13:06:39.013120 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:39 crc kubenswrapper[4786]: I0313 13:06:39.206746 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v52mq"] Mar 13 13:06:40 crc kubenswrapper[4786]: I0313 13:06:40.377390 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v52mq" podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="registry-server" containerID="cri-o://1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad" gracePeriod=2 Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:40.754924 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:40.857940 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-catalog-content\") pod \"421ee106-98a9-4361-846c-e7a957d90217\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:40.858001 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-utilities\") pod \"421ee106-98a9-4361-846c-e7a957d90217\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:40.858053 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkqt9\" (UniqueName: \"kubernetes.io/projected/421ee106-98a9-4361-846c-e7a957d90217-kube-api-access-kkqt9\") pod \"421ee106-98a9-4361-846c-e7a957d90217\" (UID: \"421ee106-98a9-4361-846c-e7a957d90217\") " Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:40.859860 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-utilities" (OuterVolumeSpecName: "utilities") pod "421ee106-98a9-4361-846c-e7a957d90217" (UID: "421ee106-98a9-4361-846c-e7a957d90217"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:40.959523 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:40.992787 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "421ee106-98a9-4361-846c-e7a957d90217" (UID: "421ee106-98a9-4361-846c-e7a957d90217"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.060737 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/421ee106-98a9-4361-846c-e7a957d90217-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.386000 4786 generic.go:334] "Generic (PLEG): container finished" podID="421ee106-98a9-4361-846c-e7a957d90217" containerID="1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad" exitCode=0 Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.386046 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v52mq" event={"ID":"421ee106-98a9-4361-846c-e7a957d90217","Type":"ContainerDied","Data":"1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad"} Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.386080 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v52mq" event={"ID":"421ee106-98a9-4361-846c-e7a957d90217","Type":"ContainerDied","Data":"ec7667ac3a68269379072fb5a563890bcb0e0c0c35331ad6dda1ba33da7ba474"} Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.386083 4786 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v52mq" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.386100 4786 scope.go:117] "RemoveContainer" containerID="1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.411004 4786 scope.go:117] "RemoveContainer" containerID="cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.828683 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421ee106-98a9-4361-846c-e7a957d90217-kube-api-access-kkqt9" (OuterVolumeSpecName: "kube-api-access-kkqt9") pod "421ee106-98a9-4361-846c-e7a957d90217" (UID: "421ee106-98a9-4361-846c-e7a957d90217"). InnerVolumeSpecName "kube-api-access-kkqt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.840467 4786 scope.go:117] "RemoveContainer" containerID="9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.858967 4786 scope.go:117] "RemoveContainer" containerID="1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad" Mar 13 13:06:41 crc kubenswrapper[4786]: E0313 13:06:41.861316 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad\": container with ID starting with 1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad not found: ID does not exist" containerID="1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.861352 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad"} err="failed to get container status \"1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad\": rpc error: code = NotFound desc = could not find container \"1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad\": container with ID starting with 1c5299139b477e3dff171aa9ef08481053143dc3251ee0f7b093280d43d491ad not found: ID does not exist" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.861373 4786 scope.go:117] "RemoveContainer" containerID="cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2" Mar 13 13:06:41 crc kubenswrapper[4786]: E0313 13:06:41.864356 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2\": container with ID starting with cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2 not found: ID does not exist" containerID="cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.864406 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2"} err="failed to get container status \"cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2\": rpc error: code = NotFound desc = could not find container \"cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2\": container with ID starting with cfc9c5798d976e8a2a34817d2bd2c213764895ad679f39df08009958a8ab7fd2 not found: ID does not exist" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.864486 4786 scope.go:117] "RemoveContainer" containerID="9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b" Mar 13 13:06:41 crc kubenswrapper[4786]: E0313 13:06:41.868231 4786 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b\": container with ID starting with 9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b not found: ID does not exist" containerID="9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.868277 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b"} err="failed to get container status \"9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b\": rpc error: code = NotFound desc = could not find container \"9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b\": container with ID starting with 9402cb7cb9a8ef92f4203a0f00be7b45192dac3d4f8ef74b072ecb74fc50e31b not found: ID does not exist" Mar 13 13:06:41 crc kubenswrapper[4786]: I0313 13:06:41.876098 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkqt9\" (UniqueName: \"kubernetes.io/projected/421ee106-98a9-4361-846c-e7a957d90217-kube-api-access-kkqt9\") on node \"crc\" DevicePath \"\"" Mar 13 13:06:42 crc kubenswrapper[4786]: I0313 13:06:42.012377 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v52mq"] Mar 13 13:06:42 crc kubenswrapper[4786]: I0313 13:06:42.021375 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v52mq"] Mar 13 13:06:43 crc kubenswrapper[4786]: I0313 13:06:43.449260 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421ee106-98a9-4361-846c-e7a957d90217" path="/var/lib/kubelet/pods/421ee106-98a9-4361-846c-e7a957d90217/volumes" Mar 13 13:06:48 crc kubenswrapper[4786]: I0313 13:06:48.234369 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-wk47l_47c63b16-0044-4bef-848e-084b958e853b/manager/0.log" Mar 13 13:06:48 crc kubenswrapper[4786]: I0313 13:06:48.678835 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw_03cba3c5-ae6a-4348-9c80-f38790f5b763/util/0.log" Mar 13 13:06:48 crc kubenswrapper[4786]: I0313 13:06:48.861921 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw_03cba3c5-ae6a-4348-9c80-f38790f5b763/pull/0.log" Mar 13 13:06:48 crc kubenswrapper[4786]: I0313 13:06:48.899828 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw_03cba3c5-ae6a-4348-9c80-f38790f5b763/util/0.log" Mar 13 13:06:49 crc kubenswrapper[4786]: I0313 13:06:49.074560 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw_03cba3c5-ae6a-4348-9c80-f38790f5b763/pull/0.log" Mar 13 13:06:49 crc kubenswrapper[4786]: I0313 13:06:49.268239 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw_03cba3c5-ae6a-4348-9c80-f38790f5b763/util/0.log" Mar 13 13:06:49 crc kubenswrapper[4786]: I0313 13:06:49.302966 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw_03cba3c5-ae6a-4348-9c80-f38790f5b763/pull/0.log" Mar 13 13:06:49 crc kubenswrapper[4786]: I0313 13:06:49.454806 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477mv7bw_03cba3c5-ae6a-4348-9c80-f38790f5b763/extract/0.log" Mar 13 13:06:49 crc 
kubenswrapper[4786]: I0313 13:06:49.799980 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-kgm9l_8ba2beff-196b-4a24-a490-86a81b9f7495/manager/0.log" Mar 13 13:06:49 crc kubenswrapper[4786]: I0313 13:06:49.937513 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-4jnjg_17f6df05-f37f-4863-b967-7b27429282f2/manager/0.log" Mar 13 13:06:50 crc kubenswrapper[4786]: I0313 13:06:50.106126 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-9w5pr_77819c69-e0b5-4eb8-a124-fb1339701ccb/manager/0.log" Mar 13 13:06:50 crc kubenswrapper[4786]: I0313 13:06:50.518663 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-652d5_4eb75275-4f14-406c-950a-fa40061041af/manager/0.log" Mar 13 13:06:50 crc kubenswrapper[4786]: I0313 13:06:50.607757 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-mf6t8_e4075ad0-d00e-4675-97e6-87e1d7e845d9/manager/0.log" Mar 13 13:06:50 crc kubenswrapper[4786]: I0313 13:06:50.670638 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-thr75_784ee575-162b-4732-b82c-8f4b3c1e5317/manager/0.log" Mar 13 13:06:50 crc kubenswrapper[4786]: I0313 13:06:50.956719 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-6dcrg_ad8d13c6-f90b-4eb4-adce-1d20f690cc98/manager/0.log" Mar 13 13:06:50 crc kubenswrapper[4786]: I0313 13:06:50.960002 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-lj4bv_f87ad580-b279-47e4-8fdd-462285c7bead/manager/0.log" Mar 13 
13:06:51 crc kubenswrapper[4786]: I0313 13:06:51.216205 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-rdph5_8c1d644e-a547-48c7-bda5-95cdb6c0220f/manager/0.log" Mar 13 13:06:51 crc kubenswrapper[4786]: I0313 13:06:51.424926 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-stg56_1073de3d-8bac-4236-a9ce-c78d7bb2865b/manager/0.log" Mar 13 13:06:51 crc kubenswrapper[4786]: I0313 13:06:51.570978 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-fnchb_8902cfaa-8c11-4e52-9f6d-d579e6cd50f5/manager/0.log" Mar 13 13:06:51 crc kubenswrapper[4786]: I0313 13:06:51.625750 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-l2vgk_52ab49ac-37a7-4ba5-a2c3-9113b6821a5d/manager/0.log" Mar 13 13:06:51 crc kubenswrapper[4786]: I0313 13:06:51.779680 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c969dbbcd-ck87b_3aeac64d-7cf0-407c-a460-423a0082a8e9/manager/0.log" Mar 13 13:06:52 crc kubenswrapper[4786]: I0313 13:06:52.060541 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b9994cf8-cgmkc_f0b720fc-612f-48bb-9681-9fc6c6b102f4/operator/0.log" Mar 13 13:06:52 crc kubenswrapper[4786]: I0313 13:06:52.268070 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-g8xgg_c133d0bb-ca55-4518-8150-5f2e1ab0dbe3/registry-server/0.log" Mar 13 13:06:52 crc kubenswrapper[4786]: I0313 13:06:52.388126 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-4skd6_810beff6-dacb-486e-be5b-fc4ad06e12d3/manager/0.log" Mar 
13 13:06:52 crc kubenswrapper[4786]: I0313 13:06:52.558726 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-c284s_d5b27da0-841c-49b1-b761-a9f61a402f6c/manager/0.log" Mar 13 13:06:52 crc kubenswrapper[4786]: I0313 13:06:52.730169 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-22b2x_c087892e-22b2-4552-a57f-e1c1d75b7917/operator/0.log" Mar 13 13:06:52 crc kubenswrapper[4786]: I0313 13:06:52.848157 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-8qxfc_f53bfcff-fb8e-46d3-8818-39147c6ac29b/manager/0.log" Mar 13 13:06:53 crc kubenswrapper[4786]: I0313 13:06:53.032937 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-gw46v_9c83fbda-99d5-4661-9ca4-24008f71bb98/manager/0.log" Mar 13 13:06:53 crc kubenswrapper[4786]: I0313 13:06:53.115633 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7795b46f77-5x2pg_9b7d27b4-b437-4bfb-b888-97b406ceb185/manager/0.log" Mar 13 13:06:53 crc kubenswrapper[4786]: I0313 13:06:53.197691 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-pl6k5_d742678a-b8a2-409a-932d-3b7002db7636/manager/0.log" Mar 13 13:06:53 crc kubenswrapper[4786]: I0313 13:06:53.292102 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-4n8g4_c18db346-860e-487c-b232-6f404fdb1b7c/manager/0.log" Mar 13 13:06:58 crc kubenswrapper[4786]: I0313 13:06:58.028668 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-wrxdd_28f6ab30-9436-45e7-a94f-b9757e0dc331/manager/0.log" Mar 13 13:07:14 crc kubenswrapper[4786]: I0313 13:07:14.112588 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5h7bj_d94210f8-5f1d-4fa5-8954-14d18f8fa0e4/control-plane-machine-set-operator/0.log" Mar 13 13:07:14 crc kubenswrapper[4786]: I0313 13:07:14.287430 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7tbp9_01831a9e-e080-4ede-905a-34277de02b46/kube-rbac-proxy/0.log" Mar 13 13:07:14 crc kubenswrapper[4786]: I0313 13:07:14.291934 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7tbp9_01831a9e-e080-4ede-905a-34277de02b46/machine-api-operator/0.log" Mar 13 13:07:27 crc kubenswrapper[4786]: I0313 13:07:27.375098 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-dcm9k_4ab306ea-196b-40b7-b016-1f29d639935b/cert-manager-controller/0.log" Mar 13 13:07:27 crc kubenswrapper[4786]: I0313 13:07:27.542379 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-2xpth_5ea8859a-f224-44ba-b451-fdf4f6401cfc/cert-manager-cainjector/0.log" Mar 13 13:07:27 crc kubenswrapper[4786]: I0313 13:07:27.581814 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-lw7q8_20a46b3c-810e-4c07-974e-42bf0a40efc1/cert-manager-webhook/0.log" Mar 13 13:07:39 crc kubenswrapper[4786]: I0313 13:07:39.740866 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-pgk4q_88668af0-a94b-4bed-a518-e18d2ac8692d/nmstate-console-plugin/0.log" Mar 13 13:07:39 crc kubenswrapper[4786]: I0313 13:07:39.942656 4786 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kh9b9_9d4d1448-188d-4c49-b287-8a7bc2298b06/nmstate-handler/0.log" Mar 13 13:07:39 crc kubenswrapper[4786]: I0313 13:07:39.953785 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-pbscj_38c81a5f-f37d-4dc5-aad9-ffe72690e341/kube-rbac-proxy/0.log" Mar 13 13:07:40 crc kubenswrapper[4786]: I0313 13:07:40.099141 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-pbscj_38c81a5f-f37d-4dc5-aad9-ffe72690e341/nmstate-metrics/0.log" Mar 13 13:07:40 crc kubenswrapper[4786]: I0313 13:07:40.160230 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-6q6x7_15ab9de9-0f55-424e-9717-d5452bcefb67/nmstate-operator/0.log" Mar 13 13:07:40 crc kubenswrapper[4786]: I0313 13:07:40.463939 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-7l6wg_1df03c26-f726-4375-b92a-1d304e653168/nmstate-webhook/0.log" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.139311 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556788-9f6w8"] Mar 13 13:08:00 crc kubenswrapper[4786]: E0313 13:08:00.140109 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea894041-703f-4722-b987-4c8060822d36" containerName="extract-content" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140121 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea894041-703f-4722-b987-4c8060822d36" containerName="extract-content" Mar 13 13:08:00 crc kubenswrapper[4786]: E0313 13:08:00.140131 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="extract-content" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140136 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="extract-content" Mar 13 13:08:00 crc kubenswrapper[4786]: E0313 13:08:00.140152 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea894041-703f-4722-b987-4c8060822d36" containerName="registry-server" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140159 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea894041-703f-4722-b987-4c8060822d36" containerName="registry-server" Mar 13 13:08:00 crc kubenswrapper[4786]: E0313 13:08:00.140167 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="extract-utilities" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140174 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="extract-utilities" Mar 13 13:08:00 crc kubenswrapper[4786]: E0313 13:08:00.140188 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="registry-server" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140194 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="registry-server" Mar 13 13:08:00 crc kubenswrapper[4786]: E0313 13:08:00.140209 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea894041-703f-4722-b987-4c8060822d36" containerName="extract-utilities" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140215 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea894041-703f-4722-b987-4c8060822d36" containerName="extract-utilities" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140340 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea894041-703f-4722-b987-4c8060822d36" containerName="registry-server" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140357 4786 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="421ee106-98a9-4361-846c-e7a957d90217" containerName="registry-server" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.140760 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.144382 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.144686 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.147598 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.159191 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556788-9f6w8"] Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.222896 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rtm8\" (UniqueName: \"kubernetes.io/projected/d31b5990-ed7e-47cf-9e0c-397c9344863e-kube-api-access-5rtm8\") pod \"auto-csr-approver-29556788-9f6w8\" (UID: \"d31b5990-ed7e-47cf-9e0c-397c9344863e\") " pod="openshift-infra/auto-csr-approver-29556788-9f6w8" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.324591 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rtm8\" (UniqueName: \"kubernetes.io/projected/d31b5990-ed7e-47cf-9e0c-397c9344863e-kube-api-access-5rtm8\") pod \"auto-csr-approver-29556788-9f6w8\" (UID: \"d31b5990-ed7e-47cf-9e0c-397c9344863e\") " pod="openshift-infra/auto-csr-approver-29556788-9f6w8" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.351456 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rtm8\" 
(UniqueName: \"kubernetes.io/projected/d31b5990-ed7e-47cf-9e0c-397c9344863e-kube-api-access-5rtm8\") pod \"auto-csr-approver-29556788-9f6w8\" (UID: \"d31b5990-ed7e-47cf-9e0c-397c9344863e\") " pod="openshift-infra/auto-csr-approver-29556788-9f6w8" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.463827 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" Mar 13 13:08:00 crc kubenswrapper[4786]: I0313 13:08:00.908410 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556788-9f6w8"] Mar 13 13:08:00 crc kubenswrapper[4786]: W0313 13:08:00.911385 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd31b5990_ed7e_47cf_9e0c_397c9344863e.slice/crio-b3d00fc3a007d90503eba1269c1ba12d61d287e38ad2b84be7b563f4066df0be WatchSource:0}: Error finding container b3d00fc3a007d90503eba1269c1ba12d61d287e38ad2b84be7b563f4066df0be: Status 404 returned error can't find the container with id b3d00fc3a007d90503eba1269c1ba12d61d287e38ad2b84be7b563f4066df0be Mar 13 13:08:01 crc kubenswrapper[4786]: I0313 13:08:01.914400 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" event={"ID":"d31b5990-ed7e-47cf-9e0c-397c9344863e","Type":"ContainerStarted","Data":"c5d40d57a906d453c8bfc706f0632d7955a2f73d27cb70049960cd988a1f0358"} Mar 13 13:08:01 crc kubenswrapper[4786]: I0313 13:08:01.914706 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" event={"ID":"d31b5990-ed7e-47cf-9e0c-397c9344863e","Type":"ContainerStarted","Data":"b3d00fc3a007d90503eba1269c1ba12d61d287e38ad2b84be7b563f4066df0be"} Mar 13 13:08:01 crc kubenswrapper[4786]: I0313 13:08:01.931511 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" 
podStartSLOduration=1.187693984 podStartE2EDuration="1.931491636s" podCreationTimestamp="2026-03-13 13:08:00 +0000 UTC" firstStartedPulling="2026-03-13 13:08:00.913471087 +0000 UTC m=+4868.193124534" lastFinishedPulling="2026-03-13 13:08:01.657268739 +0000 UTC m=+4868.936922186" observedRunningTime="2026-03-13 13:08:01.924356102 +0000 UTC m=+4869.204009549" watchObservedRunningTime="2026-03-13 13:08:01.931491636 +0000 UTC m=+4869.211145083" Mar 13 13:08:02 crc kubenswrapper[4786]: I0313 13:08:02.922964 4786 generic.go:334] "Generic (PLEG): container finished" podID="d31b5990-ed7e-47cf-9e0c-397c9344863e" containerID="c5d40d57a906d453c8bfc706f0632d7955a2f73d27cb70049960cd988a1f0358" exitCode=0 Mar 13 13:08:02 crc kubenswrapper[4786]: I0313 13:08:02.923019 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" event={"ID":"d31b5990-ed7e-47cf-9e0c-397c9344863e","Type":"ContainerDied","Data":"c5d40d57a906d453c8bfc706f0632d7955a2f73d27cb70049960cd988a1f0358"} Mar 13 13:08:04 crc kubenswrapper[4786]: I0313 13:08:04.169168 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" Mar 13 13:08:04 crc kubenswrapper[4786]: I0313 13:08:04.276066 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rtm8\" (UniqueName: \"kubernetes.io/projected/d31b5990-ed7e-47cf-9e0c-397c9344863e-kube-api-access-5rtm8\") pod \"d31b5990-ed7e-47cf-9e0c-397c9344863e\" (UID: \"d31b5990-ed7e-47cf-9e0c-397c9344863e\") " Mar 13 13:08:04 crc kubenswrapper[4786]: I0313 13:08:04.283094 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31b5990-ed7e-47cf-9e0c-397c9344863e-kube-api-access-5rtm8" (OuterVolumeSpecName: "kube-api-access-5rtm8") pod "d31b5990-ed7e-47cf-9e0c-397c9344863e" (UID: "d31b5990-ed7e-47cf-9e0c-397c9344863e"). InnerVolumeSpecName "kube-api-access-5rtm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:08:04 crc kubenswrapper[4786]: I0313 13:08:04.377305 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rtm8\" (UniqueName: \"kubernetes.io/projected/d31b5990-ed7e-47cf-9e0c-397c9344863e-kube-api-access-5rtm8\") on node \"crc\" DevicePath \"\"" Mar 13 13:08:04 crc kubenswrapper[4786]: I0313 13:08:04.951069 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" event={"ID":"d31b5990-ed7e-47cf-9e0c-397c9344863e","Type":"ContainerDied","Data":"b3d00fc3a007d90503eba1269c1ba12d61d287e38ad2b84be7b563f4066df0be"} Mar 13 13:08:04 crc kubenswrapper[4786]: I0313 13:08:04.951127 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3d00fc3a007d90503eba1269c1ba12d61d287e38ad2b84be7b563f4066df0be" Mar 13 13:08:04 crc kubenswrapper[4786]: I0313 13:08:04.951213 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556788-9f6w8" Mar 13 13:08:05 crc kubenswrapper[4786]: I0313 13:08:05.007082 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556782-hdfbf"] Mar 13 13:08:05 crc kubenswrapper[4786]: I0313 13:08:05.009258 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556782-hdfbf"] Mar 13 13:08:05 crc kubenswrapper[4786]: I0313 13:08:05.451090 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10371942-9d00-4050-9660-0d53bd08e6b7" path="/var/lib/kubelet/pods/10371942-9d00-4050-9660-0d53bd08e6b7/volumes" Mar 13 13:08:07 crc kubenswrapper[4786]: I0313 13:08:07.543494 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-f4p6b_2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2/kube-rbac-proxy/0.log" Mar 13 13:08:07 crc kubenswrapper[4786]: I0313 13:08:07.720480 4786 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-frr-files/0.log" Mar 13 13:08:07 crc kubenswrapper[4786]: I0313 13:08:07.908533 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-f4p6b_2fdfe97b-ebe7-4ccc-bf3e-0a9bd375f7a2/controller/0.log" Mar 13 13:08:07 crc kubenswrapper[4786]: I0313 13:08:07.971291 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-reloader/0.log" Mar 13 13:08:07 crc kubenswrapper[4786]: I0313 13:08:07.992301 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-frr-files/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.027193 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-metrics/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.141738 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-reloader/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.281835 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-metrics/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.285083 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-frr-files/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.318850 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-reloader/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.346080 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-metrics/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.533424 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-frr-files/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.545434 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-metrics/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.557128 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/cp-reloader/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.560476 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/controller/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.723322 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/frr-metrics/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.762055 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/kube-rbac-proxy-frr/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.770420 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/kube-rbac-proxy/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.942640 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-pxszh_f0ef5780-741a-4adc-a453-fcf5f4a8813e/frr-k8s-webhook-server/0.log" Mar 13 13:08:08 crc kubenswrapper[4786]: I0313 13:08:08.944519 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/reloader/0.log" Mar 13 13:08:09 crc kubenswrapper[4786]: I0313 13:08:09.144986 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5f7647b4b8-hmxxz_d7f40266-579b-4e46-8c76-6085fc8b2824/manager/0.log" Mar 13 13:08:09 crc kubenswrapper[4786]: I0313 13:08:09.383553 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6958cc8947-vbltt_d4473a40-978f-429a-81cb-34ca70c51ecc/webhook-server/0.log" Mar 13 13:08:09 crc kubenswrapper[4786]: I0313 13:08:09.531276 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gj86q_c9500808-ad3b-464f-ac49-ddb93b08f58e/kube-rbac-proxy/0.log" Mar 13 13:08:10 crc kubenswrapper[4786]: I0313 13:08:10.176801 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gj86q_c9500808-ad3b-464f-ac49-ddb93b08f58e/speaker/0.log" Mar 13 13:08:10 crc kubenswrapper[4786]: I0313 13:08:10.291420 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mz2wz_62899598-48ee-4c70-8641-5f7defde9e8f/frr/0.log" Mar 13 13:08:19 crc kubenswrapper[4786]: I0313 13:08:19.359399 4786 scope.go:117] "RemoveContainer" containerID="182377278b7c09698facfc29bec1b26c273c42fb9f13ed0903a3fe5a6d8e4d11" Mar 13 13:08:21 crc kubenswrapper[4786]: I0313 13:08:21.803642 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb_2e5b85ae-234c-423a-bae6-0a3bbe74f5c5/util/0.log" Mar 13 13:08:21 crc kubenswrapper[4786]: I0313 13:08:21.986805 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb_2e5b85ae-234c-423a-bae6-0a3bbe74f5c5/util/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.032222 4786 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb_2e5b85ae-234c-423a-bae6-0a3bbe74f5c5/pull/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.051466 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb_2e5b85ae-234c-423a-bae6-0a3bbe74f5c5/pull/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.207728 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb_2e5b85ae-234c-423a-bae6-0a3bbe74f5c5/pull/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.215174 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb_2e5b85ae-234c-423a-bae6-0a3bbe74f5c5/util/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.232125 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874sxgrb_2e5b85ae-234c-423a-bae6-0a3bbe74f5c5/extract/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.401995 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv_cc164e59-8f60-4750-ab7b-935346318ac8/util/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.539720 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv_cc164e59-8f60-4750-ab7b-935346318ac8/util/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.550865 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv_cc164e59-8f60-4750-ab7b-935346318ac8/pull/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.560555 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv_cc164e59-8f60-4750-ab7b-935346318ac8/pull/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.715689 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv_cc164e59-8f60-4750-ab7b-935346318ac8/util/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.735788 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv_cc164e59-8f60-4750-ab7b-935346318ac8/pull/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.755211 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c176nvv_cc164e59-8f60-4750-ab7b-935346318ac8/extract/0.log" Mar 13 13:08:22 crc kubenswrapper[4786]: I0313 13:08:22.883002 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd_23df2d8e-3fd0-4358-a8d0-4e9f65c28abd/util/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.079155 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd_23df2d8e-3fd0-4358-a8d0-4e9f65c28abd/pull/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.107747 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd_23df2d8e-3fd0-4358-a8d0-4e9f65c28abd/util/0.log" Mar 13 
13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.113399 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd_23df2d8e-3fd0-4358-a8d0-4e9f65c28abd/pull/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.265895 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd_23df2d8e-3fd0-4358-a8d0-4e9f65c28abd/util/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.297489 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd_23df2d8e-3fd0-4358-a8d0-4e9f65c28abd/pull/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.299816 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5cx8qd_23df2d8e-3fd0-4358-a8d0-4e9f65c28abd/extract/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.455593 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jjl8g_0f417ecf-9afd-4518-b54c-dbfedb17c67a/extract-utilities/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.598919 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jjl8g_0f417ecf-9afd-4518-b54c-dbfedb17c67a/extract-utilities/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.610179 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jjl8g_0f417ecf-9afd-4518-b54c-dbfedb17c67a/extract-content/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.681180 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jjl8g_0f417ecf-9afd-4518-b54c-dbfedb17c67a/extract-content/0.log" Mar 13 
13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.767480 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jjl8g_0f417ecf-9afd-4518-b54c-dbfedb17c67a/extract-utilities/0.log" Mar 13 13:08:23 crc kubenswrapper[4786]: I0313 13:08:23.767565 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jjl8g_0f417ecf-9afd-4518-b54c-dbfedb17c67a/extract-content/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.034098 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbxlz_0a8284a0-87cc-4a4f-9498-f7103367855e/extract-utilities/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.209790 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbxlz_0a8284a0-87cc-4a4f-9498-f7103367855e/extract-content/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.229615 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbxlz_0a8284a0-87cc-4a4f-9498-f7103367855e/extract-utilities/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.309599 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbxlz_0a8284a0-87cc-4a4f-9498-f7103367855e/extract-content/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.424915 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jjl8g_0f417ecf-9afd-4518-b54c-dbfedb17c67a/registry-server/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.446154 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbxlz_0a8284a0-87cc-4a4f-9498-f7103367855e/extract-utilities/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.471002 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dbxlz_0a8284a0-87cc-4a4f-9498-f7103367855e/extract-content/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.669907 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rl4k6_f4cef03e-60f8-491b-9ba5-b93a42121b2e/marketplace-operator/0.log" Mar 13 13:08:24 crc kubenswrapper[4786]: I0313 13:08:24.872163 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cwtl5_5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5/extract-utilities/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.033755 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cwtl5_5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5/extract-utilities/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.092798 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cwtl5_5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5/extract-content/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.109695 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cwtl5_5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5/extract-content/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.198139 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbxlz_0a8284a0-87cc-4a4f-9498-f7103367855e/registry-server/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.274980 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cwtl5_5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5/extract-utilities/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.301910 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-cwtl5_5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5/extract-content/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.483681 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cwtl5_5f6e6ad0-c921-4367-82d1-0f0c4c4bcba5/registry-server/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.484264 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vv77w_ae33a694-0398-4129-9926-1b6dcb6ecc40/extract-utilities/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.637495 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vv77w_ae33a694-0398-4129-9926-1b6dcb6ecc40/extract-content/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.637511 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vv77w_ae33a694-0398-4129-9926-1b6dcb6ecc40/extract-content/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.652458 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vv77w_ae33a694-0398-4129-9926-1b6dcb6ecc40/extract-utilities/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.807992 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vv77w_ae33a694-0398-4129-9926-1b6dcb6ecc40/extract-content/0.log" Mar 13 13:08:25 crc kubenswrapper[4786]: I0313 13:08:25.840101 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vv77w_ae33a694-0398-4129-9926-1b6dcb6ecc40/extract-utilities/0.log" Mar 13 13:08:26 crc kubenswrapper[4786]: I0313 13:08:26.395609 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vv77w_ae33a694-0398-4129-9926-1b6dcb6ecc40/registry-server/0.log" Mar 13 
13:08:38 crc kubenswrapper[4786]: I0313 13:08:38.168693 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:08:38 crc kubenswrapper[4786]: I0313 13:08:38.169258 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:09:08 crc kubenswrapper[4786]: I0313 13:09:08.169214 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:09:08 crc kubenswrapper[4786]: I0313 13:09:08.169664 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.169980 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.170619 4786 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.170705 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.171765 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed75f631c4a09fe1dd62b258181c03ad5318f839913c480f2cabad9f2ce5eca0"} pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.171845 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" containerID="cri-o://ed75f631c4a09fe1dd62b258181c03ad5318f839913c480f2cabad9f2ce5eca0" gracePeriod=600 Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.650484 4786 generic.go:334] "Generic (PLEG): container finished" podID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerID="ed75f631c4a09fe1dd62b258181c03ad5318f839913c480f2cabad9f2ce5eca0" exitCode=0 Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.650568 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerDied","Data":"ed75f631c4a09fe1dd62b258181c03ad5318f839913c480f2cabad9f2ce5eca0"} Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.650853 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" event={"ID":"75da9242-3ddf-4eca-82df-a5fc998b0fdc","Type":"ContainerStarted","Data":"4b7651fa1578a6a7683c4872a4debadcec77ee6be5e836f69e21c222009ccd27"} Mar 13 13:09:38 crc kubenswrapper[4786]: I0313 13:09:38.650898 4786 scope.go:117] "RemoveContainer" containerID="2bc5d74a50c566b90b0cd0e312745d804c7ea714ce6440362a5342ef23833fab" Mar 13 13:09:40 crc kubenswrapper[4786]: I0313 13:09:40.669639 4786 generic.go:334] "Generic (PLEG): container finished" podID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerID="6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277" exitCode=0 Mar 13 13:09:40 crc kubenswrapper[4786]: I0313 13:09:40.669712 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pd5br/must-gather-zk95r" event={"ID":"c9c178b0-a334-4f29-a5c5-c7d86e67f16b","Type":"ContainerDied","Data":"6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277"} Mar 13 13:09:40 crc kubenswrapper[4786]: I0313 13:09:40.671617 4786 scope.go:117] "RemoveContainer" containerID="6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277" Mar 13 13:09:41 crc kubenswrapper[4786]: I0313 13:09:41.171874 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pd5br_must-gather-zk95r_c9c178b0-a334-4f29-a5c5-c7d86e67f16b/gather/0.log" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.186340 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pd5br/must-gather-zk95r"] Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.188954 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pd5br/must-gather-zk95r" podUID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerName="copy" containerID="cri-o://a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d" gracePeriod=2 Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.197862 4786 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pd5br/must-gather-zk95r"] Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.540947 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pd5br_must-gather-zk95r_c9c178b0-a334-4f29-a5c5-c7d86e67f16b/copy/0.log" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.541819 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.686586 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-must-gather-output\") pod \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\" (UID: \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\") " Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.686766 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97hgs\" (UniqueName: \"kubernetes.io/projected/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-kube-api-access-97hgs\") pod \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\" (UID: \"c9c178b0-a334-4f29-a5c5-c7d86e67f16b\") " Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.694161 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-kube-api-access-97hgs" (OuterVolumeSpecName: "kube-api-access-97hgs") pod "c9c178b0-a334-4f29-a5c5-c7d86e67f16b" (UID: "c9c178b0-a334-4f29-a5c5-c7d86e67f16b"). InnerVolumeSpecName "kube-api-access-97hgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.737016 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pd5br_must-gather-zk95r_c9c178b0-a334-4f29-a5c5-c7d86e67f16b/copy/0.log" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.737770 4786 generic.go:334] "Generic (PLEG): container finished" podID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerID="a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d" exitCode=143 Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.737822 4786 scope.go:117] "RemoveContainer" containerID="a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.737894 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pd5br/must-gather-zk95r" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.771230 4786 scope.go:117] "RemoveContainer" containerID="6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.788453 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97hgs\" (UniqueName: \"kubernetes.io/projected/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-kube-api-access-97hgs\") on node \"crc\" DevicePath \"\"" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.810347 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c9c178b0-a334-4f29-a5c5-c7d86e67f16b" (UID: "c9c178b0-a334-4f29-a5c5-c7d86e67f16b"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.836005 4786 scope.go:117] "RemoveContainer" containerID="a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d" Mar 13 13:09:48 crc kubenswrapper[4786]: E0313 13:09:48.836591 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d\": container with ID starting with a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d not found: ID does not exist" containerID="a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.836629 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d"} err="failed to get container status \"a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d\": rpc error: code = NotFound desc = could not find container \"a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d\": container with ID starting with a555bf84254fd57ab4acc9f8923386673623e7bb7a64b59cf9a458610a7d4a3d not found: ID does not exist" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.836650 4786 scope.go:117] "RemoveContainer" containerID="6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277" Mar 13 13:09:48 crc kubenswrapper[4786]: E0313 13:09:48.838083 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277\": container with ID starting with 6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277 not found: ID does not exist" containerID="6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.838134 
4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277"} err="failed to get container status \"6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277\": rpc error: code = NotFound desc = could not find container \"6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277\": container with ID starting with 6720b6072e68e0a92edfb159b27e9bbd4862be5c929435062f90178e64c3b277 not found: ID does not exist" Mar 13 13:09:48 crc kubenswrapper[4786]: I0313 13:09:48.889771 4786 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c178b0-a334-4f29-a5c5-c7d86e67f16b-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 13:09:49 crc kubenswrapper[4786]: I0313 13:09:49.450857 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" path="/var/lib/kubelet/pods/c9c178b0-a334-4f29-a5c5-c7d86e67f16b/volumes" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.141988 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556790-57bjg"] Mar 13 13:10:00 crc kubenswrapper[4786]: E0313 13:10:00.142810 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerName="copy" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.142823 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerName="copy" Mar 13 13:10:00 crc kubenswrapper[4786]: E0313 13:10:00.142836 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerName="gather" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.142841 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerName="gather" Mar 13 13:10:00 
crc kubenswrapper[4786]: E0313 13:10:00.142865 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31b5990-ed7e-47cf-9e0c-397c9344863e" containerName="oc" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.142871 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31b5990-ed7e-47cf-9e0c-397c9344863e" containerName="oc" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.143045 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerName="copy" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.143064 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31b5990-ed7e-47cf-9e0c-397c9344863e" containerName="oc" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.143075 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c178b0-a334-4f29-a5c5-c7d86e67f16b" containerName="gather" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.143516 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556790-57bjg" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.145390 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.148527 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fg649" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.148777 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.154807 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tlnr\" (UniqueName: \"kubernetes.io/projected/b824904a-01ff-4fe0-821e-8d738b3dda7a-kube-api-access-8tlnr\") pod \"auto-csr-approver-29556790-57bjg\" (UID: \"b824904a-01ff-4fe0-821e-8d738b3dda7a\") " pod="openshift-infra/auto-csr-approver-29556790-57bjg" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.159637 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556790-57bjg"] Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.255783 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tlnr\" (UniqueName: \"kubernetes.io/projected/b824904a-01ff-4fe0-821e-8d738b3dda7a-kube-api-access-8tlnr\") pod \"auto-csr-approver-29556790-57bjg\" (UID: \"b824904a-01ff-4fe0-821e-8d738b3dda7a\") " pod="openshift-infra/auto-csr-approver-29556790-57bjg" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.275985 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tlnr\" (UniqueName: \"kubernetes.io/projected/b824904a-01ff-4fe0-821e-8d738b3dda7a-kube-api-access-8tlnr\") pod \"auto-csr-approver-29556790-57bjg\" (UID: \"b824904a-01ff-4fe0-821e-8d738b3dda7a\") " 
pod="openshift-infra/auto-csr-approver-29556790-57bjg" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.467301 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556790-57bjg" Mar 13 13:10:00 crc kubenswrapper[4786]: I0313 13:10:00.870142 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556790-57bjg"] Mar 13 13:10:01 crc kubenswrapper[4786]: I0313 13:10:01.839780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556790-57bjg" event={"ID":"b824904a-01ff-4fe0-821e-8d738b3dda7a","Type":"ContainerStarted","Data":"76890f243d42c215fda0fce2235c43d6270af72da5e5e671857aded8521f39db"} Mar 13 13:10:02 crc kubenswrapper[4786]: I0313 13:10:02.847225 4786 generic.go:334] "Generic (PLEG): container finished" podID="b824904a-01ff-4fe0-821e-8d738b3dda7a" containerID="97991fa9c21f919ddf2c253cb6b6309dd1594a317d96784281ea860df054170b" exitCode=0 Mar 13 13:10:02 crc kubenswrapper[4786]: I0313 13:10:02.847289 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556790-57bjg" event={"ID":"b824904a-01ff-4fe0-821e-8d738b3dda7a","Type":"ContainerDied","Data":"97991fa9c21f919ddf2c253cb6b6309dd1594a317d96784281ea860df054170b"} Mar 13 13:10:04 crc kubenswrapper[4786]: I0313 13:10:04.172158 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556790-57bjg" Mar 13 13:10:04 crc kubenswrapper[4786]: I0313 13:10:04.314466 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tlnr\" (UniqueName: \"kubernetes.io/projected/b824904a-01ff-4fe0-821e-8d738b3dda7a-kube-api-access-8tlnr\") pod \"b824904a-01ff-4fe0-821e-8d738b3dda7a\" (UID: \"b824904a-01ff-4fe0-821e-8d738b3dda7a\") " Mar 13 13:10:04 crc kubenswrapper[4786]: I0313 13:10:04.320187 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b824904a-01ff-4fe0-821e-8d738b3dda7a-kube-api-access-8tlnr" (OuterVolumeSpecName: "kube-api-access-8tlnr") pod "b824904a-01ff-4fe0-821e-8d738b3dda7a" (UID: "b824904a-01ff-4fe0-821e-8d738b3dda7a"). InnerVolumeSpecName "kube-api-access-8tlnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:10:04 crc kubenswrapper[4786]: I0313 13:10:04.416310 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tlnr\" (UniqueName: \"kubernetes.io/projected/b824904a-01ff-4fe0-821e-8d738b3dda7a-kube-api-access-8tlnr\") on node \"crc\" DevicePath \"\"" Mar 13 13:10:04 crc kubenswrapper[4786]: I0313 13:10:04.863780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556790-57bjg" event={"ID":"b824904a-01ff-4fe0-821e-8d738b3dda7a","Type":"ContainerDied","Data":"76890f243d42c215fda0fce2235c43d6270af72da5e5e671857aded8521f39db"} Mar 13 13:10:04 crc kubenswrapper[4786]: I0313 13:10:04.863814 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556790-57bjg" Mar 13 13:10:04 crc kubenswrapper[4786]: I0313 13:10:04.863821 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76890f243d42c215fda0fce2235c43d6270af72da5e5e671857aded8521f39db" Mar 13 13:10:05 crc kubenswrapper[4786]: I0313 13:10:05.221509 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556784-77t49"] Mar 13 13:10:05 crc kubenswrapper[4786]: I0313 13:10:05.226693 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556784-77t49"] Mar 13 13:10:05 crc kubenswrapper[4786]: I0313 13:10:05.469928 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c261bdd-397b-4b84-83a9-36c4e1800eb3" path="/var/lib/kubelet/pods/0c261bdd-397b-4b84-83a9-36c4e1800eb3/volumes" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.496200 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9x9td"] Mar 13 13:10:11 crc kubenswrapper[4786]: E0313 13:10:11.497379 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b824904a-01ff-4fe0-821e-8d738b3dda7a" containerName="oc" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.497400 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b824904a-01ff-4fe0-821e-8d738b3dda7a" containerName="oc" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.497573 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b824904a-01ff-4fe0-821e-8d738b3dda7a" containerName="oc" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.498668 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.513186 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9x9td"] Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.625330 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-utilities\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.625379 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-catalog-content\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.625436 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtns6\" (UniqueName: \"kubernetes.io/projected/55693fd3-7203-4c87-a9b4-0e2b8b120117-kube-api-access-vtns6\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.726866 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-utilities\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.726930 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-catalog-content\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.726987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtns6\" (UniqueName: \"kubernetes.io/projected/55693fd3-7203-4c87-a9b4-0e2b8b120117-kube-api-access-vtns6\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.727736 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-utilities\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.727971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-catalog-content\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.750181 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtns6\" (UniqueName: \"kubernetes.io/projected/55693fd3-7203-4c87-a9b4-0e2b8b120117-kube-api-access-vtns6\") pod \"redhat-marketplace-9x9td\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:11 crc kubenswrapper[4786]: I0313 13:10:11.844324 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:12 crc kubenswrapper[4786]: I0313 13:10:12.281715 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9x9td"] Mar 13 13:10:12 crc kubenswrapper[4786]: I0313 13:10:12.917356 4786 generic.go:334] "Generic (PLEG): container finished" podID="55693fd3-7203-4c87-a9b4-0e2b8b120117" containerID="0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8" exitCode=0 Mar 13 13:10:12 crc kubenswrapper[4786]: I0313 13:10:12.917418 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9x9td" event={"ID":"55693fd3-7203-4c87-a9b4-0e2b8b120117","Type":"ContainerDied","Data":"0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8"} Mar 13 13:10:12 crc kubenswrapper[4786]: I0313 13:10:12.918575 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9x9td" event={"ID":"55693fd3-7203-4c87-a9b4-0e2b8b120117","Type":"ContainerStarted","Data":"cebce1050a67755f91b8c2c1ee662d6034d3045bb95a57727e43cfc7da7a799b"} Mar 13 13:10:13 crc kubenswrapper[4786]: I0313 13:10:13.926730 4786 generic.go:334] "Generic (PLEG): container finished" podID="55693fd3-7203-4c87-a9b4-0e2b8b120117" containerID="e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33" exitCode=0 Mar 13 13:10:13 crc kubenswrapper[4786]: I0313 13:10:13.926772 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9x9td" event={"ID":"55693fd3-7203-4c87-a9b4-0e2b8b120117","Type":"ContainerDied","Data":"e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33"} Mar 13 13:10:14 crc kubenswrapper[4786]: I0313 13:10:14.935459 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9x9td" 
event={"ID":"55693fd3-7203-4c87-a9b4-0e2b8b120117","Type":"ContainerStarted","Data":"819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280"} Mar 13 13:10:14 crc kubenswrapper[4786]: I0313 13:10:14.958171 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9x9td" podStartSLOduration=2.351940123 podStartE2EDuration="3.958149528s" podCreationTimestamp="2026-03-13 13:10:11 +0000 UTC" firstStartedPulling="2026-03-13 13:10:12.919231091 +0000 UTC m=+5000.198884538" lastFinishedPulling="2026-03-13 13:10:14.525440496 +0000 UTC m=+5001.805093943" observedRunningTime="2026-03-13 13:10:14.95085597 +0000 UTC m=+5002.230509417" watchObservedRunningTime="2026-03-13 13:10:14.958149528 +0000 UTC m=+5002.237802985" Mar 13 13:10:19 crc kubenswrapper[4786]: I0313 13:10:19.457173 4786 scope.go:117] "RemoveContainer" containerID="5a9bf9a4b6ba3a94305b9a1903db42206c11eec14be819923256e0849a6a50db" Mar 13 13:10:21 crc kubenswrapper[4786]: I0313 13:10:21.845086 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:21 crc kubenswrapper[4786]: I0313 13:10:21.845552 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:21 crc kubenswrapper[4786]: I0313 13:10:21.898490 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:22 crc kubenswrapper[4786]: I0313 13:10:22.018151 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:22 crc kubenswrapper[4786]: I0313 13:10:22.128586 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9x9td"] Mar 13 13:10:23 crc kubenswrapper[4786]: I0313 13:10:23.990385 4786 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-9x9td" podUID="55693fd3-7203-4c87-a9b4-0e2b8b120117" containerName="registry-server" containerID="cri-o://819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280" gracePeriod=2 Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.875591 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.904754 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-utilities\") pod \"55693fd3-7203-4c87-a9b4-0e2b8b120117\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.904831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-catalog-content\") pod \"55693fd3-7203-4c87-a9b4-0e2b8b120117\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.904901 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtns6\" (UniqueName: \"kubernetes.io/projected/55693fd3-7203-4c87-a9b4-0e2b8b120117-kube-api-access-vtns6\") pod \"55693fd3-7203-4c87-a9b4-0e2b8b120117\" (UID: \"55693fd3-7203-4c87-a9b4-0e2b8b120117\") " Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.909316 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-utilities" (OuterVolumeSpecName: "utilities") pod "55693fd3-7203-4c87-a9b4-0e2b8b120117" (UID: "55693fd3-7203-4c87-a9b4-0e2b8b120117"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.912045 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55693fd3-7203-4c87-a9b4-0e2b8b120117-kube-api-access-vtns6" (OuterVolumeSpecName: "kube-api-access-vtns6") pod "55693fd3-7203-4c87-a9b4-0e2b8b120117" (UID: "55693fd3-7203-4c87-a9b4-0e2b8b120117"). InnerVolumeSpecName "kube-api-access-vtns6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.945734 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55693fd3-7203-4c87-a9b4-0e2b8b120117" (UID: "55693fd3-7203-4c87-a9b4-0e2b8b120117"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.999610 4786 generic.go:334] "Generic (PLEG): container finished" podID="55693fd3-7203-4c87-a9b4-0e2b8b120117" containerID="819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280" exitCode=0 Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.999658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9x9td" event={"ID":"55693fd3-7203-4c87-a9b4-0e2b8b120117","Type":"ContainerDied","Data":"819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280"} Mar 13 13:10:24 crc kubenswrapper[4786]: I0313 13:10:24.999672 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9x9td" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:24.999694 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9x9td" event={"ID":"55693fd3-7203-4c87-a9b4-0e2b8b120117","Type":"ContainerDied","Data":"cebce1050a67755f91b8c2c1ee662d6034d3045bb95a57727e43cfc7da7a799b"} Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:24.999714 4786 scope.go:117] "RemoveContainer" containerID="819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.007396 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.007436 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55693fd3-7203-4c87-a9b4-0e2b8b120117-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.007450 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtns6\" (UniqueName: \"kubernetes.io/projected/55693fd3-7203-4c87-a9b4-0e2b8b120117-kube-api-access-vtns6\") on node \"crc\" DevicePath \"\"" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.019196 4786 scope.go:117] "RemoveContainer" containerID="e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.032257 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9x9td"] Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.038159 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9x9td"] Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.054068 4786 scope.go:117] 
"RemoveContainer" containerID="0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.089852 4786 scope.go:117] "RemoveContainer" containerID="819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280" Mar 13 13:10:25 crc kubenswrapper[4786]: E0313 13:10:25.090529 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280\": container with ID starting with 819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280 not found: ID does not exist" containerID="819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.090560 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280"} err="failed to get container status \"819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280\": rpc error: code = NotFound desc = could not find container \"819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280\": container with ID starting with 819b945764dca36bbd88759c7069534731173c17a9ec17d703041e777f04f280 not found: ID does not exist" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.090579 4786 scope.go:117] "RemoveContainer" containerID="e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33" Mar 13 13:10:25 crc kubenswrapper[4786]: E0313 13:10:25.090818 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33\": container with ID starting with e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33 not found: ID does not exist" containerID="e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33" Mar 13 13:10:25 crc 
kubenswrapper[4786]: I0313 13:10:25.090853 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33"} err="failed to get container status \"e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33\": rpc error: code = NotFound desc = could not find container \"e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33\": container with ID starting with e15d52e29dc891cbf80ae6884be5ea1efdc1349702f4078ae8b4d658f393ea33 not found: ID does not exist" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.090871 4786 scope.go:117] "RemoveContainer" containerID="0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8" Mar 13 13:10:25 crc kubenswrapper[4786]: E0313 13:10:25.091161 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8\": container with ID starting with 0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8 not found: ID does not exist" containerID="0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.091200 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8"} err="failed to get container status \"0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8\": rpc error: code = NotFound desc = could not find container \"0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8\": container with ID starting with 0d5ce29ed11e64f4a0469eb231fbe2294c446147e95978dc04e6b66d9bed4ac8 not found: ID does not exist" Mar 13 13:10:25 crc kubenswrapper[4786]: I0313 13:10:25.452612 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55693fd3-7203-4c87-a9b4-0e2b8b120117" 
path="/var/lib/kubelet/pods/55693fd3-7203-4c87-a9b4-0e2b8b120117/volumes" Mar 13 13:11:38 crc kubenswrapper[4786]: I0313 13:11:38.169715 4786 patch_prober.go:28] interesting pod/machine-config-daemon-8ncs8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:11:38 crc kubenswrapper[4786]: I0313 13:11:38.170386 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8ncs8" podUID="75da9242-3ddf-4eca-82df-a5fc998b0fdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"